| Unnamed: 0 (int64, 0–832k) | id (float64, 2.49B–32.1B) | type (stringclasses, 1 value) | created_at (stringlengths, 19) | repo (stringlengths, 7–112) | repo_url (stringlengths, 36–141) | action (stringclasses, 3 values) | title (stringlengths, 1–744) | labels (stringlengths, 4–574) | body (stringlengths, 9–211k) | index (stringclasses, 10 values) | text_combine (stringlengths, 96–211k) | label (stringclasses, 2 values) | text (stringlengths, 96–188k) | binary_label (int64, 0–1) |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
11,284
| 14,079,665,700
|
IssuesEvent
|
2020-11-04 15:09:36
|
googleapis/python-bigquery
|
https://api.github.com/repos/googleapis/python-bigquery
|
closed
|
refactor: split job.py and test_job.py
|
api: bigquery type: process
|
job.py and test_job.py are both thousands of lines long. It's actually causing my code editor to slow down. These files have a natural split into the various job types:
* _base -- for the _AsyncJob base class
* extract
* copy
* load
* query
The `job/__init__.py` should import all of these sub-modules to retain backwards compatibility.
Might want to wait for https://github.com/googleapis/python-bigquery/pull/41 and https://github.com/googleapis/python-bigquery/pull/347 to be merged before working on this.
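A minimal sketch of the re-export shim the issue proposes (the submodule names come from the list above; the public job class names are assumptions for illustration, not taken from the codebase):
```python
# job/__init__.py -- a sketch of the backwards-compatibility layer.
# Each job type moves to its own submodule; re-importing the public
# names here keeps existing `from google.cloud.bigquery.job import X`
# call sites working unchanged.
from ._base import _AsyncJob      # shared base class (named in the issue)
from .copy import CopyJob         # class names below are assumptions
from .extract import ExtractJob
from .load import LoadJob
from .query import QueryJob

__all__ = ["_AsyncJob", "CopyJob", "ExtractJob", "LoadJob", "QueryJob"]
```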
|
1.0
|
refactor: split job.py and test_job.py - job.py and test_job.py are both thousands of lines long. It's actually causing my code editor to slow down. These files have a natural split into the various job types:
* _base -- for the _AsyncJob base class
* extract
* copy
* load
* query
The `job/__init__.py` should import all of these sub-modules to retain backwards compatibility.
Might want to wait for https://github.com/googleapis/python-bigquery/pull/41 and https://github.com/googleapis/python-bigquery/pull/347 to be merged before working on this.
|
process
|
refactor split job py and test job py job py and test job py are both thousands of lines long it s actually causing my code editor to slow down these files have a natural split into the various job types base for the asyncjob base class extract copy load query the job init py should import all of these sub modules to retain backwards compatibility might want to wait for and to be merged before working on this
| 1
|
15,866
| 20,036,239,295
|
IssuesEvent
|
2022-02-02 12:13:06
|
GoogleCloudPlatform/fda-mystudies
|
https://api.github.com/repos/GoogleCloudPlatform/fda-mystudies
|
closed
|
iOS 15 > Sign in page is not responsive post clicking on 'Forgot passcode? sign in again' link
|
Bug P2 iOS Process: Fixed Process: Tested dev
|
**Steps:**
1. Install the app in iOS 15+ compatible version
2. Login/Signup
3. Navigated to studies list
4. Minimize and resume the app
5. Click on 'Forgot passcode? Signin again' link
6. Observe entire sign in page is not responsive
**Actual:** Sign in page is not responsive post clicking on 'Forgot passcode? sign in again' link
**Expected:** Page should be responsive
1. iOS Version: 15.0.1
2. Issue was not observed in < iOS 15 versions
**Refer video:**
https://user-images.githubusercontent.com/60386291/135969592-4383d0b2-c6a3-4f68-b81e-35020d2b9fc1.MOV
|
2.0
|
iOS 15 > Sign in page is not responsive post clicking on 'Forgot passcode? sign in again' link - **Steps:**
1. Install the app in iOS 15+ compatible version
2. Login/Signup
3. Navigated to studies list
4. Minimize and resume the app
5. Click on 'Forgot passcode? Signin again' link
6. Observe entire sign in page is not responsive
**Actual:** Sign in page is not responsive post clicking on 'Forgot passcode? sign in again' link
**Expected:** Page should be responsive
1. iOS Version: 15.0.1
2. Issue was not observed in < iOS 15 versions
**Refer video:**
https://user-images.githubusercontent.com/60386291/135969592-4383d0b2-c6a3-4f68-b81e-35020d2b9fc1.MOV
|
process
|
ios sign in page is not responsive post clicking on forgot passcode sign in again link steps install the app in ios compatible version login signup navigated to studies list minimize and resume the app click on forgot passcode signin again link observe entire sign in page is not responsive actual sign in page is not responsive post clicking on forgot passcode sign in again link expected page should be responsive ios version issue was not observed in ios versions refer video
| 1
|
279,092
| 30,702,447,442
|
IssuesEvent
|
2023-07-27 01:30:58
|
artsking/packages_apps_settings_10.0.0_r33
|
https://api.github.com/repos/artsking/packages_apps_settings_10.0.0_r33
|
reopened
|
CVE-2022-20529 (Low) detected in Settingsandroid-10.0.0_r33
|
Mend: dependency security vulnerability
|
## CVE-2022-20529 - Low Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>Settingsandroid-10.0.0_r33</b></p></summary>
<p>
<p>Library home page: <a href=https://android.googlesource.com/platform/packages/apps/Settings>https://android.googlesource.com/platform/packages/apps/Settings</a></p>
<p>Found in HEAD commit: <a href="https://github.com/artsking/packages_apps_settings_10.0.0_r33/commit/081b5699d08adc751bd29d01eff86bb13c550019">081b5699d08adc751bd29d01eff86bb13c550019</a></p>
<p>Found in base branch: <b>master</b></p></p>
</details>
</p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Source Files (1)</summary>
<p></p>
<p>
<img src='https://s3.amazonaws.com/wss-public/bitbucketImages/xRedImage.png' width=19 height=20> <b>/src/com/android/settings/wifi/WifiDialogActivity.java</b>
</p>
</details>
<p></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/low_vul.png?' width=19 height=20> Vulnerability Details</summary>
<p>
In multiple locations of WifiDialogActivity.java, there is a possible limited lockscreen bypass due to a logic error in the code. This could lead to local escalation of privilege in wifi settings with no additional execution privileges needed. User interaction is not needed for exploitation. Product: Android. Versions: Android-13. Android ID: A-231583603
<p>Publish Date: 2022-12-16
<p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2022-20529>CVE-2022-20529</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>2.4</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Physical
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: Low
- Integrity Impact: None
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://android.googlesource.com/platform/frameworks/base/+/8b83094ae3080495f6f95d0a1549ccc5594b5354">https://android.googlesource.com/platform/frameworks/base/+/8b83094ae3080495f6f95d0a1549ccc5594b5354</a></p>
<p>Release Date: 2022-12-16</p>
<p>Fix Resolution: android-13.0.0_r16</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with Mend [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
|
True
|
CVE-2022-20529 (Low) detected in Settingsandroid-10.0.0_r33 - ## CVE-2022-20529 - Low Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>Settingsandroid-10.0.0_r33</b></p></summary>
<p>
<p>Library home page: <a href=https://android.googlesource.com/platform/packages/apps/Settings>https://android.googlesource.com/platform/packages/apps/Settings</a></p>
<p>Found in HEAD commit: <a href="https://github.com/artsking/packages_apps_settings_10.0.0_r33/commit/081b5699d08adc751bd29d01eff86bb13c550019">081b5699d08adc751bd29d01eff86bb13c550019</a></p>
<p>Found in base branch: <b>master</b></p></p>
</details>
</p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Source Files (1)</summary>
<p></p>
<p>
<img src='https://s3.amazonaws.com/wss-public/bitbucketImages/xRedImage.png' width=19 height=20> <b>/src/com/android/settings/wifi/WifiDialogActivity.java</b>
</p>
</details>
<p></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/low_vul.png?' width=19 height=20> Vulnerability Details</summary>
<p>
In multiple locations of WifiDialogActivity.java, there is a possible limited lockscreen bypass due to a logic error in the code. This could lead to local escalation of privilege in wifi settings with no additional execution privileges needed. User interaction is not needed for exploitation. Product: Android. Versions: Android-13. Android ID: A-231583603
<p>Publish Date: 2022-12-16
<p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2022-20529>CVE-2022-20529</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>2.4</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Physical
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: Low
- Integrity Impact: None
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://android.googlesource.com/platform/frameworks/base/+/8b83094ae3080495f6f95d0a1549ccc5594b5354">https://android.googlesource.com/platform/frameworks/base/+/8b83094ae3080495f6f95d0a1549ccc5594b5354</a></p>
<p>Release Date: 2022-12-16</p>
<p>Fix Resolution: android-13.0.0_r16</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with Mend [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
|
non_process
|
cve low detected in settingsandroid cve low severity vulnerability vulnerable library settingsandroid library home page a href found in head commit a href found in base branch master vulnerable source files src com android settings wifi wifidialogactivity java vulnerability details in multiple locations of wifidialogactivity java there is a possible limited lockscreen bypass due to a logic error in the code this could lead to local escalation of privilege in wifi settings with no additional execution privileges needed user interaction is not needed for exploitation product androidversions android id a publish date url a href cvss score details base score metrics exploitability metrics attack vector physical attack complexity low privileges required none user interaction none scope unchanged impact metrics confidentiality impact low integrity impact none availability impact none for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution android step up your open source security game with mend
| 0
|
89,145
| 8,196,215,816
|
IssuesEvent
|
2018-08-31 09:06:32
|
junit-team/junit5
|
https://api.github.com/repos/junit-team/junit5
|
closed
|
junit-json-params
|
3rd-party: pioneer component: Jupiter theme: parameterized tests
|
## Overview
I was thinking about the CsvSource a while back and how JSON is sometimes a more convenient / expressive format to write data in, particularly when dealing with the input and output of web based APIs. So I created a small library, [junit-json-params](https://github.com/joshka/junit-json-params), which achieves this.
I'm wondering whether this is something that would be considered for being added to the JUnit project? If not, commenting here mainly for a little visibility on the library (and a call out for ideas to make this better).
|
1.0
|
junit-json-params - ## Overview
I was thinking about the CsvSource a while back and how JSON is sometimes a more convenient / expressive format to write data in, particularly when dealing with the input and output of web based APIs. So I created a small library, [junit-json-params](https://github.com/joshka/junit-json-params), which achieves this.
I'm wondering whether this is something that would be considered for being added to the JUnit project? If not, commenting here mainly for a little visibility on the library (and a call out for ideas to make this better).
|
non_process
|
junit json params overview i was thinking about the csvsource a while back and how json is sometimes a more convenient expressive format to write data in particularly when dealing with the input and output of web based apis so i created a small library which achieves this i m wondering whether this is something that would be considered for being added to the junit project if not commenting here mainly for a little visibility on the library and a call out for ideas to make this better
| 0
|
179,155
| 30,175,288,410
|
IssuesEvent
|
2023-07-04 03:37:32
|
chromium/subspace
|
https://api.github.com/repos/chromium/subspace
|
opened
|
Consider dropping FnMutBoxs from Iterators
|
design
|
It's painful to have to heap allocate to use capturing lambdas.
If each Iterator method that returns an Iterator is marked with lifetime_bound, will Clang warn if we construct a lambda with a reference in it, put it into an Iterator, chain it into another Iterator, and leave the scope?
|
1.0
|
Consider dropping FnMutBoxs from Iterators - It's painful to have to heap allocate to use capturing lambdas.
If each Iterator method that returns an Iterator is marked with lifetime_bound, will Clang warn if we construct a lambda with a reference in it, put it into an Iterator, chain it into another Iterator, and leave the scope?
|
non_process
|
consider dropping fnmutboxs from iterators it s painful to have to heap allocate to use capturing lambdas if each iterator method that returns an iterator is marked with lifetime bound will clang warn if we construct a lambda with a reference in it put it into an iterator chain it into another iterator and leave the scope
| 0
|
5,077
| 7,873,176,857
|
IssuesEvent
|
2018-06-25 13:36:34
|
threefoldfoundation/tfchain
|
https://api.github.com/repos/threefoldfoundation/tfchain
|
closed
|
Synced flag on testnet block creators doesn't toggle
|
process_wontfix type_bug
|
The block creators are deployed in the testlab, connected to each other using the local IPs.
|
1.0
|
Synced flag on testnet block creators doesn't toggle - The block creators are deployed in the testlab, connected to each other using the local IPs.
|
process
|
synced flag on testnet block creators doesn t toggle the block creators are deployed in the testlab connected to eachother using the local ip s
| 1
|
9,372
| 12,374,206,463
|
IssuesEvent
|
2020-05-19 00:50:57
|
allinurl/goaccess
|
https://api.github.com/repos/allinurl/goaccess
|
closed
|
Performance issues
|
log-processing
|
Not sure if this is a bug but I am seeking some clarification. I'm trying to run goaccess on a 5GB large log file (1 year of traffic on a smallish site, approximately 23M hits, most of it crawlers of course). I struggle with performance.
On my local (old) MacBook Pro it's using one core fully and starts at 12000 lines/s but after a minute starts falling down to 3000. This is in-memory (when I ran using disk and left it overnight and checked in the morning, it was down at 400 lines/s and still not finished). Command is:
`goaccess -d -o report.html`
On my server I tried running it in Docker, it did not fall down but instead Docker socket disconnects after a while:
`read unix @->/var/run/docker.sock: read: connection reset by peer`
Any clues or tricks I should think of? Can goaccess use more than one core? Is the slowdown I see normal? Have the docker issues been seen before?
|
1.0
|
Performance issues - Not sure if this is a bug but I am seeking some clarification. I'm trying to run goaccess on a 5GB large log file (1 year of traffic on a smallish site, approximately 23M hits, most of it crawlers of course). I struggle with performance.
On my local (old) MacBook Pro it's using one core fully and starts at 12000 lines/s but after a minute starts falling down to 3000. This is in-memory (when I ran using disk and left it overnight and checked in the morning, it was down at 400 lines/s and still not finished). Command is:
`goaccess -d -o report.html`
On my server I tried running it in Docker, it did not fall down but instead Docker socket disconnects after a while:
`read unix @->/var/run/docker.sock: read: connection reset by peer`
Any clues or tricks I should think of? Can goaccess use more than one core? Is the slowdown I see normal? Have the docker issues been seen before?
|
process
|
performance issues not sure if this is a bug but i am seeking some clarification i m trying to run goaccess on a large log file year of traffic on a smallish site approximately hits most of it crawlers of course i struggle with performance on my local old macbook pro it s using one core fully and starts at lines s but after a minute starts falling down to this is in memory when i ran using disk and left it over night and checked in the morning it was down at lines s and still not finished command is goaccess d o report html on my server i tried running it in docker it did not fall down but instead docker socket disconnects after a while read unix var run docker sock read connection reset by peer any clues or tricks i should think of can goaccess use more than one core is the slowdown i see normal have the docker issues been seen before
| 1
|
13,458
| 15,936,748,119
|
IssuesEvent
|
2021-04-14 11:34:23
|
scikit-learn/scikit-learn
|
https://api.github.com/repos/scikit-learn/scikit-learn
|
closed
|
BUG: Regression with StandardScaler due to #19527
|
Bug Regression module:preprocessing
|
#### Describe the bug
#19527 introduced a regression with StandardScaler when dealing with data with small magnitudes.
#### Steps/Code to Reproduce
In MNE-Python some of our data channels have magnitudes in the ~1e-13 range. On 638b7689bbbfae4bcc4592c6f8a43ce86b571f0b or before, this code (which uses random data of different scales) returns all True, which seems correct:
```python
import numpy as np
from sklearn.preprocessing import StandardScaler

for scale in (1e15, 1e10, 1e5, 1, 1e-5, 1e-10, 1e-15):
    data = np.random.RandomState(0).rand(1000, 4) - 0.5
    data *= scale
    scaler = StandardScaler(with_mean=True, with_std=True)
    X = scaler.fit_transform(data)
    stds = np.std(data, axis=0)
    means = np.mean(data, axis=0)
    print(np.allclose(X, (data - means) / stds, rtol=1e-7, atol=1e-7 * scale))
```
But on c748e465c76c43a173ad5ab2fd82639210f8e895 / after #19527, anything "too small" starts to fail, as I get 5 True and the last two scale factors (1e-10, 1e-15) False. Hence `StandardScaler` no longer standardizes the data.
cc @ogrisel since this came from your PR and @maikia @rth @agramfort since you approved the PR
|
1.0
|
BUG: Regression with StandardScaler due to #19527 - #### Describe the bug
#19527 introduced a regression with StandardScaler when dealing with data with small magnitudes.
#### Steps/Code to Reproduce
In MNE-Python some of our data channels have magnitudes in the ~1e-13 range. On 638b7689bbbfae4bcc4592c6f8a43ce86b571f0b or before, this code (which uses random data of different scales) returns all True, which seems correct:
```python
import numpy as np
from sklearn.preprocessing import StandardScaler

for scale in (1e15, 1e10, 1e5, 1, 1e-5, 1e-10, 1e-15):
    data = np.random.RandomState(0).rand(1000, 4) - 0.5
    data *= scale
    scaler = StandardScaler(with_mean=True, with_std=True)
    X = scaler.fit_transform(data)
    stds = np.std(data, axis=0)
    means = np.mean(data, axis=0)
    print(np.allclose(X, (data - means) / stds, rtol=1e-7, atol=1e-7 * scale))
```
But on c748e465c76c43a173ad5ab2fd82639210f8e895 / after #19527, anything "too small" starts to fail, as I get 5 True and the last two scale factors (1e-10, 1e-15) False. Hence `StandardScaler` no longer standardizes the data.
cc @ogrisel since this came from your PR and @maikia @rth @agramfort since you approved the PR
|
process
|
bug regression with standardscaler due to describe the bug introduced a regression with standardscaler when dealing with data with small magnitudes steps code to reproduce in mne python some of our data channels have magnitudes in the range on or before this code which uses random data of different scales returns all true which seems correct import numpy as np from sklearn preprocessing import standardscaler for scale in data np random randomstate rand data scale scaler standardscaler with mean true with std true x scaler fit transform data stds np std data axis means np mean data axis print np allclose x data means stds rtol atol scale but on after anything too small starts to fail as i get true and the last two scale factors false hence standardscaler no longer standardizes the data cc ogrisel since this came from your pr and maikia rth agramfort since you approved the pr
| 1
|
16,530
| 2,615,118,207
|
IssuesEvent
|
2015-03-01 05:43:48
|
chrsmith/google-api-java-client
|
https://api.github.com/repos/chrsmith/google-api-java-client
|
closed
|
Google Docs CRUD
|
auto-migrated Priority-Medium Type-Sample
|
```
Which API and version (e.g. Google Calendar Data API version 2)?
Document List API - 3
What format (e.g. JSON, Atom)?
Atom
What Authentication (e.g. OAuth, OAuth 2, Android, ClientLogin)?
Android
Java environment (e.g. Java 6, Android 2.2, App Engine 1.3.7)?
Android 2.2
External references, such as API reference guide?
Please provide any additional information below.
I have tried for a couple days to post a request to the document list api with
this library without success. I can read list feeds and update existing notes,
but all attempts to create new notes return in 400 bad request errors. I have
followed the form of several other examples that are posted for creating new
entries, but this one seems to be unique. I've done this successfully in the
past by forming my own requests, but I really would like to use this library
for a project and I just need to know how this is done. My current method is to
create an AtomContent object, set the entry to be my new Entry object that
simply has a new title and I've tried posting to
https://docs.google.com/feeds/default/private/full as well as to some folders.
Thank you.
```
Original issue reported on code.google.com by `dcgraham7` on 31 Dec 2010 at 6:07
* Merged into: #18
|
1.0
|
Google Docs CRUD - ```
Which API and version (e.g. Google Calendar Data API version 2)?
Document List API - 3
What format (e.g. JSON, Atom)?
Atom
What Authentication (e.g. OAuth, OAuth 2, Android, ClientLogin)?
Android
Java environment (e.g. Java 6, Android 2.2, App Engine 1.3.7)?
Android 2.2
External references, such as API reference guide?
Please provide any additional information below.
I have tried for a couple days to post a request to the document list api with
this library without success. I can read list feeds and update existing notes,
but all attempts to create new notes return in 400 bad request errors. I have
followed the form of several other examples that are posted for creating new
entries, but this one seems to be unique. I've done this successfully in the
past by forming my own requests, but I really would like to use this library
for a project and I just need to know how this is done. My current method is to
create an AtomContent object, set the entry to be my new Entry object that
simply has a new title and I've tried posting to
https://docs.google.com/feeds/default/private/full as well as to some folders.
Thank you.
```
Original issue reported on code.google.com by `dcgraham7` on 31 Dec 2010 at 6:07
* Merged into: #18
|
non_process
|
google docs crud which api and version e g google calendar data api version document list api what format e g json atom atom what authentation e g oauth oauth android clientlogin android java environment e g java android app engine android external references such as api reference guide please provide any additional information below i have tried for a couple days to post a request to the document list api with this library without success i can read list feeds and update existing notes but all attempts to create new notes return in bad request errors i have followed the form of several other examples that are posted for creating new entries but this one seems to be unique i ve done this successfully in the past by forming my own requests but i really would like to use this library for a project and i just need to know how this is done my current method is to create an atomcontent object set the entry to be my new entry object that simply has a new title and i ve tried posting to as well as to some folders thank you original issue reported on code google com by on dec at merged into
| 0
|
19,330
| 25,472,542,549
|
IssuesEvent
|
2022-11-25 11:28:02
|
GoogleCloudPlatform/fda-mystudies
|
https://api.github.com/repos/GoogleCloudPlatform/fda-mystudies
|
closed
|
[IDP] [PM] A popup message is getting displayed when adding a new admin in the PM
|
Bug P1 Participant manager Process: Fixed Process: Tested QA Process: Tested dev
|
**Pre-condition:** IDP should be disabled in the PM
**Steps:**
1. Login to PM
2. Click on 'Admins' tab
3. Click on 'Add new admin' button
4. Complete all the fields
5. Click on 'Add admin user and invite' button and Verify
**AR:** A popup message is getting displayed when adding a new admin in the PM
**ER:** A popup message should not get displayed when adding a new admin in the PM

|
3.0
|
[IDP] [PM] A popup message is getting displayed when adding a new admin in the PM - **Pre-condition:** IDP should be disabled in the PM
**Steps:**
1. Login to PM
2. Click on 'Admins' tab
3. Click on 'Add new admin' button
4. Complete all the fields
5. Click on 'Add admin user and invite' button and Verify
**AR:** A popup message is getting displayed when adding a new admin in the PM
**ER:** A popup message should not get displayed when adding a new admin in the PM

|
process
|
a popup message is getting displayed when adding a new admin in the pm pre condition idp should be disabled in the pm steps login to pm click on admins tab click on add new admin button complete all the fields click on add admin user and invite button and verify ar a popup message is getting displayed when adding a new admin in the pm er a popup message should not get displayed when adding a new admin in the pm
| 1
|
1,385
| 3,952,516,495
|
IssuesEvent
|
2016-04-29 09:11:15
|
nodejs/node
|
https://api.github.com/repos/nodejs/node
|
reopened
|
stdout/stderr buffering considerations
|
discuss libuv net process
|
_I tried to discuss this some time ago at IRC, but postponed it for quite a long time. Also I started the discussion of this in #1741, but I would like to extract the more specific discussion to a separate issue._
I could miss some details, but will try to give a quick overview here.
Several issues here:
1. Many calls to `console.log` (e.g. calling it in a loop) could chew up all the memory and die — #1741, #2970, #3171.
2. `console.log` has different behavior while printing to a terminal and being redirected to a file. — https://github.com/nodejs/node/issues/1741#issuecomment-105333932.
3. Output is sometimes truncated — #6297, there were other ones as far as I remember.
4. The behaviour seems to differ across platforms.
As I understand it — the output has an implicit write buffer (as it's non-blocking) of unlimited size.
One approach to fixing this would be to:
1. Introduce an explicit cyclic write buffer.
2. Make writes to that cyclic buffer blocking.
3. Make writes from the buffer to the actual output non blocking.
4. When the cyclic buffer reaches its maximum size (e.g. 10 MiB) — block further writes to the buffer until a corresponding part of it is freed.
5. On (normal) exit, make sure the buffer is flushed.
For almost all cases, except for the ones that are currently broken, this would behave as a non-blocking buffer (because writes to the buffer are considerably faster than writes from the buffer to file/terminal).
For cases when the data is being piped to the output too quickly and when the output file/terminal does not manage to output it at the same rate — the write would turn into a blocking operation. It would also be blocking at the exit until all the data is written.
--
Another approach would be to monitor (and limit) the size of data that is contained in the implicit buffer coming from the async queue, and make the operations block when that limit is reached.
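A rough Python sketch of that cyclic-buffer proposal (the class, the 10 MiB cap, and the chunk size are illustrative assumptions, not Node.js internals): writes block only once the buffer is full, a background drainer performs the slow output, and `close()` flushes on exit.
```python
# Steps 1-5 as a bounded queue: producers block when it is full
# (back-pressure), a drainer thread writes to the real output, and
# close() guarantees everything buffered reaches the file/terminal.
import queue
import threading


class BoundedWriter:
    def __init__(self, out, max_bytes=10 * 1024 * 1024, chunk=8192):
        # Queue slots approximate the byte cap: one chunk per slot.
        self._q = queue.Queue(maxsize=max(1, max_bytes // chunk))
        self._out = out
        self._drainer = threading.Thread(target=self._drain, daemon=True)
        self._drainer.start()

    def write(self, data):
        # Step 4: blocks the caller only when the consumer lags behind.
        self._q.put(data)

    def _drain(self):
        # Step 3: actual output happens off the caller's thread.
        while True:
            data = self._q.get()
            if data is None:
                return
            self._out.write(data)
            self._q.task_done()

    def close(self):
        # Step 5: wait for the buffer to empty, then stop the drainer.
        self._q.join()
        self._q.put(None)
        self._drainer.join()
        self._out.flush()
```
Under these assumptions, `BoundedWriter(sys.stdout)` behaves like a non-blocking buffer for bursty writers and degrades to blocking only when the 10 MiB window is exhausted.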
|
1.0
|
stdout/stderr buffering considerations - _I tried to discuss this some time ago at IRC, but postponed it for quite a long time. Also I started the discussion of this in #1741, but I would like to extract the more specific discussion to a separate issue._
I could miss some details, but will try to give a quick overview here.
Several issues here:
1. Many calls to `console.log` (e.g. calling it in a loop) could chew up all the memory and die — #1741, #2970, #3171.
2. `console.log` has different behavior while printing to a terminal and being redirected to a file. — https://github.com/nodejs/node/issues/1741#issuecomment-105333932.
3. Output is sometimes truncated — #6297, there were other ones as far as I remember.
4. The behaviour seems to differ across platforms.
As I understand it — the output has an implicit write buffer (as it's non-blocking) of unlimited size.
One approach to fixing this would be to:
1. Introduce an explicit cyclic write buffer.
2. Make writes to that cyclic buffer blocking.
3. Make writes from the buffer to the actual output non blocking.
4. When the cyclic buffer reaches its maximum size (e.g. 10 MiB) — block further writes to the buffer until a corresponding part of it is freed.
5. On (normal) exit, make sure the buffer is flushed.
For almost all cases, except for the ones that are currently broken, this would behave as a non-blocking buffer (because writes to the buffer are considerably faster than writes from the buffer to file/terminal).
For cases when the data is being piped to the output too quickly and when the output file/terminal does not manage to output it at the same rate — the write would turn into a blocking operation. It would also be blocking at the exit until all the data is written.
--
Another approach would be to monitor (and limit) the size of data that is contained in the implicit buffer coming from the async queue, and make the operations block when that limit is reached.
|
process
|
stdout stderr buffering considerations i tried to discuss this some time ago at irc but postponed it for quite a long time also i started the discussion of this in but i would like to extract the more specific discussion to a separate issue i could miss some details but will try to give a quick overview here several issues here many calls to console log e g calling it in a loop could chew up all the memory and die — console log has different behavior while printing to a terminal and being redirected to a file — output is sometimes truncated — there were other ones as far as i remember the behaviour seems to differ across platforms as i understand it — the output has an implicit write buffer as it s non blocking of unlimited size one approach to fixing this would be to introduce an explicit cyclic write buffer make writes to that cyclic buffer blocking make writes from the buffer to the actual output non blocking when the cyclic buffer reaches it s maximum size e g mib — block further writes to the buffer until a corresponding part of it is freed on normal exit make sure the buffer is flushed for almost all cases except for the ones that are currently broken this would behave as a non blocking buffer because writes to the buffer are considerably faster than writes from the buffer to file terminal for cases when the data is being piped to the output too quickly and when the output file terminal does not manage to output it at the same rate — the write would turn into a blocking operation it would also be blocking at the exit until all the data is written another approach would be to monitor and limit the size of data that is contained in the implicit buffer coming from the async queue and make the operations block when that limit is reached
| 1
|
500,511
| 14,500,949,810
|
IssuesEvent
|
2020-12-11 18:46:37
|
monarch-initiative/mondo
|
https://api.github.com/repos/monarch-initiative/mondo
|
closed
|
Consider changing mental retardation to intellectual disability
|
high priority
|
This would be consistent with GARD
https://github.com/obophenotype/human-phenotype-ontology/issues/3290
|
1.0
|
Consider changing mental retardation to intellectual disability - This would be consistent with GARD
https://github.com/obophenotype/human-phenotype-ontology/issues/3290
|
non_process
|
consider changing mental retardation to intellectual disability this would be consistent with gard
| 0
|
18,333
| 24,453,203,017
|
IssuesEvent
|
2022-10-07 02:37:24
|
pyanodon/pybugreports
|
https://api.github.com/repos/pyanodon/pybugreports
|
closed
|
Crystal Mine Can Take Nonsensical Modules
|
bug mod:pycoalprocessing
|
### Mod source
PyAE Beta
### Which mod are you having an issue with?
- [ ] pyalienlife
- [ ] pyalternativeenergy
- [ ] pycoalprocessing
- [ ] pyfusionenergy
- [ ] pyhightech
- [ ] pyindustry
- [ ] pypetroleumhandling
- [ ] pypostprocessing
- [ ] pyrawores
### Operating system
>=Windows 10
### What kind of issue is this?
- [ ] Compatibility
- [ ] Locale (names, descriptions, unknown keys)
- [ ] Graphical
- [ ] Crash
- [ ] Progression
- [ ] Balance
- [ ] Pypostprocessing failure
- [X] Other
### What is the problem?
The crystal mine can take nonsensical modules. See the picture:

### Steps to reproduce
_No response_
### Additional context
_No response_
### Log file
_No response_
|
1.0
|
Crystal Mine Can Take Nonsensical Modules - ### Mod source
PyAE Beta
### Which mod are you having an issue with?
- [ ] pyalienlife
- [ ] pyalternativeenergy
- [ ] pycoalprocessing
- [ ] pyfusionenergy
- [ ] pyhightech
- [ ] pyindustry
- [ ] pypetroleumhandling
- [ ] pypostprocessing
- [ ] pyrawores
### Operating system
>=Windows 10
### What kind of issue is this?
- [ ] Compatibility
- [ ] Locale (names, descriptions, unknown keys)
- [ ] Graphical
- [ ] Crash
- [ ] Progression
- [ ] Balance
- [ ] Pypostprocessing failure
- [X] Other
### What is the problem?
The crystal mine can take nonsensical modules. See the picture:

### Steps to reproduce
_No response_
### Additional context
_No response_
### Log file
_No response_
|
process
|
crystal mine can take nonsensical modules mod source pyae beta which mod are you having an issue with pyalienlife pyalternativeenergy pycoalprocessing pyfusionenergy pyhightech pyindustry pypetroleumhandling pypostprocessing pyrawores operating system windows what kind of issue is this compatibility locale names descriptions unknown keys graphical crash progression balance pypostprocessing failure other what is the problem the crystal mine can take nonsensical modules see the picture steps to reproduce no response additional context no response log file no response
| 1
|
93,841
| 15,946,423,097
|
IssuesEvent
|
2021-04-15 01:02:02
|
jgeraigery/core
|
https://api.github.com/repos/jgeraigery/core
|
opened
|
CVE-2021-21290 (Medium) detected in multiple libraries
|
security vulnerability
|
## CVE-2021-21290 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Libraries - <b>netty-all-4.1.2.Final.jar</b>, <b>netty-handler-4.1.29.Final.jar</b>, <b>netty-codec-http-4.1.29.Final.jar</b></p></summary>
<p>
<details><summary><b>netty-all-4.1.2.Final.jar</b></p></summary>
<p>Netty is an asynchronous event-driven network application framework for
rapid development of maintainable high performance protocol servers and
clients.</p>
<p>Library home page: <a href="http://netty.io/">http://netty.io/</a></p>
<p>Path to dependency file: core/nimbus-core/pom.xml</p>
<p>Path to vulnerable library: canner/.m2/repository/io/netty/netty-all/4.1.2.Final/netty-all-4.1.2.Final.jar,canner/.m2/repository/io/netty/netty-all/4.1.2.Final/netty-all-4.1.2.Final.jar,canner/.m2/repository/io/netty/netty-all/4.1.2.Final/netty-all-4.1.2.Final.jar</p>
<p>
Dependency Hierarchy:
- :x: **netty-all-4.1.2.Final.jar** (Vulnerable Library)
</details>
<details><summary><b>netty-handler-4.1.29.Final.jar</b></p></summary>
<p>Netty is an asynchronous event-driven network application framework for
rapid development of maintainable high performance protocol servers and
clients.</p>
<p>Library home page: <a href="http://netty.io/">http://netty.io/</a></p>
<p>Path to dependency file: core/nimbus-test/pom.xml</p>
<p>Path to vulnerable library: /home/wss-scanner/.m2/repository/io/netty/netty-handler/4.1.29.Final/netty-handler-4.1.29.Final.jar,/home/wss-scanner/.m2/repository/io/netty/netty-handler/4.1.29.Final/netty-handler-4.1.29.Final.jar,/home/wss-scanner/.m2/repository/io/netty/netty-handler/4.1.29.Final/netty-handler-4.1.29.Final.jar</p>
<p>
Dependency Hierarchy:
- qpid-jms-client-0.41.0.jar (Root Library)
- :x: **netty-handler-4.1.29.Final.jar** (Vulnerable Library)
</details>
<details><summary><b>netty-codec-http-4.1.29.Final.jar</b></p></summary>
<p>Netty is an asynchronous event-driven network application framework for
rapid development of maintainable high performance protocol servers and
clients.</p>
<p>Library home page: <a href="http://netty.io/">http://netty.io/</a></p>
<p>Path to dependency file: core/nimbus-test/pom.xml</p>
<p>Path to vulnerable library: /home/wss-scanner/.m2/repository/io/netty/netty-codec-http/4.1.29.Final/netty-codec-http-4.1.29.Final.jar,/home/wss-scanner/.m2/repository/io/netty/netty-codec-http/4.1.29.Final/netty-codec-http-4.1.29.Final.jar,/home/wss-scanner/.m2/repository/io/netty/netty-codec-http/4.1.29.Final/netty-codec-http-4.1.29.Final.jar</p>
<p>
Dependency Hierarchy:
- qpid-jms-client-0.41.0.jar (Root Library)
- :x: **netty-codec-http-4.1.29.Final.jar** (Vulnerable Library)
</details>
<p>Found in base branch: <b>master</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
Netty is an open-source, asynchronous event-driven network application framework for rapid development of maintainable high performance protocol servers & clients. In Netty before version 4.1.59.Final there is a vulnerability on Unix-like systems involving an insecure temp file. When netty's multipart decoders are used, local information disclosure can occur via the local system temporary directory if temporary storing uploads on the disk is enabled. On unix-like systems, the temporary directory is shared between all users. As such, writing to this directory using APIs that do not explicitly set the file/directory permissions can lead to information disclosure. Of note, this does not impact modern MacOS Operating Systems. The method "File.createTempFile" on unix-like systems creates a random file, but, by default will create this file with the permissions "-rw-r--r--". Thus, if sensitive information is written to this file, other local users can read this information. This is the case in netty's "AbstractDiskHttpData", which is vulnerable. This has been fixed in version 4.1.59.Final. As a workaround, one may specify your own "java.io.tmpdir" when you start the JVM or use "DefaultHttpDataFactory.setBaseDir(...)" to set the directory to something that is only readable by the current user.
<p>Publish Date: 2021-02-08
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-21290>CVE-2021-21290</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>5.5</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Local
- Attack Complexity: Low
- Privileges Required: Low
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: None
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://github.com/netty/netty/security/advisories/GHSA-5mcr-gq6c-3hq2">https://github.com/netty/netty/security/advisories/GHSA-5mcr-gq6c-3hq2</a></p>
<p>Release Date: 2021-02-08</p>
<p>Fix Resolution: io.netty:netty-codec-http:4.1.59.Final</p>
</p>
</details>
<p></p>
<!-- <REMEDIATE>{"isOpenPROnVulnerability":true,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"Java","groupId":"io.netty","packageName":"netty-all","packageVersion":"4.1.2.Final","packageFilePaths":["/nimbus-core/pom.xml","/nimbus-entity-dsl/pom.xml","/nimbus-test/pom.xml"],"isTransitiveDependency":false,"dependencyTree":"io.netty:netty-all:4.1.2.Final","isMinimumFixVersionAvailable":true,"minimumFixVersion":"io.netty:netty-codec-http:4.1.59.Final"},{"packageType":"Java","groupId":"io.netty","packageName":"netty-handler","packageVersion":"4.1.29.Final","packageFilePaths":["/nimbus-test/pom.xml","/nimbus-core/pom.xml","/nimbus-entity-dsl/pom.xml"],"isTransitiveDependency":true,"dependencyTree":"org.apache.qpid:qpid-jms-client:0.41.0;io.netty:netty-handler:4.1.29.Final","isMinimumFixVersionAvailable":true,"minimumFixVersion":"io.netty:netty-codec-http:4.1.59.Final"},{"packageType":"Java","groupId":"io.netty","packageName":"netty-codec-http","packageVersion":"4.1.29.Final","packageFilePaths":["/nimbus-test/pom.xml","/nimbus-core/pom.xml","/nimbus-entity-dsl/pom.xml"],"isTransitiveDependency":true,"dependencyTree":"org.apache.qpid:qpid-jms-client:0.41.0;io.netty:netty-codec-http:4.1.29.Final","isMinimumFixVersionAvailable":true,"minimumFixVersion":"io.netty:netty-codec-http:4.1.59.Final"}],"baseBranches":["master"],"vulnerabilityIdentifier":"CVE-2021-21290","vulnerabilityDetails":"Netty is an open-source, asynchronous event-driven network application framework for rapid development of maintainable high performance protocol servers \u0026 clients. In Netty before version 4.1.59.Final there is a vulnerability on Unix-like systems involving an insecure temp file. When netty\u0027s multipart decoders are used local information disclosure can occur via the local system temporary directory if temporary storing uploads on the disk is enabled. On unix-like systems, the temporary directory is shared between all user. As such, writing to this directory using APIs that do not explicitly set the file/directory permissions can lead to information disclosure. Of note, this does not impact modern MacOS Operating Systems. The method \"File.createTempFile\" on unix-like systems creates a random file, but, by default will create this file with the permissions \"-rw-r--r--\". Thus, if sensitive information is written to this file, other local users can read this information. This is the case in netty\u0027s \"AbstractDiskHttpData\" is vulnerable. This has been fixed in version 4.1.59.Final. As a workaround, one may specify your own \"java.io.tmpdir\" when you start the JVM or use \"DefaultHttpDataFactory.setBaseDir(...)\" to set the directory to something that is only readable by the current user.","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-21290","cvss3Severity":"medium","cvss3Score":"5.5","cvss3Metrics":{"A":"None","AC":"Low","PR":"Low","S":"Unchanged","C":"High","UI":"None","AV":"Local","I":"None"},"extraData":{}}</REMEDIATE> -->
|
True
|
CVE-2021-21290 (Medium) detected in multiple libraries - ## CVE-2021-21290 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Libraries - <b>netty-all-4.1.2.Final.jar</b>, <b>netty-handler-4.1.29.Final.jar</b>, <b>netty-codec-http-4.1.29.Final.jar</b></p></summary>
<p>
<details><summary><b>netty-all-4.1.2.Final.jar</b></p></summary>
<p>Netty is an asynchronous event-driven network application framework for
rapid development of maintainable high performance protocol servers and
clients.</p>
<p>Library home page: <a href="http://netty.io/">http://netty.io/</a></p>
<p>Path to dependency file: core/nimbus-core/pom.xml</p>
<p>Path to vulnerable library: canner/.m2/repository/io/netty/netty-all/4.1.2.Final/netty-all-4.1.2.Final.jar,canner/.m2/repository/io/netty/netty-all/4.1.2.Final/netty-all-4.1.2.Final.jar,canner/.m2/repository/io/netty/netty-all/4.1.2.Final/netty-all-4.1.2.Final.jar</p>
<p>
Dependency Hierarchy:
- :x: **netty-all-4.1.2.Final.jar** (Vulnerable Library)
</details>
<details><summary><b>netty-handler-4.1.29.Final.jar</b></p></summary>
<p>Netty is an asynchronous event-driven network application framework for
rapid development of maintainable high performance protocol servers and
clients.</p>
<p>Library home page: <a href="http://netty.io/">http://netty.io/</a></p>
<p>Path to dependency file: core/nimbus-test/pom.xml</p>
<p>Path to vulnerable library: /home/wss-scanner/.m2/repository/io/netty/netty-handler/4.1.29.Final/netty-handler-4.1.29.Final.jar,/home/wss-scanner/.m2/repository/io/netty/netty-handler/4.1.29.Final/netty-handler-4.1.29.Final.jar,/home/wss-scanner/.m2/repository/io/netty/netty-handler/4.1.29.Final/netty-handler-4.1.29.Final.jar</p>
<p>
Dependency Hierarchy:
- qpid-jms-client-0.41.0.jar (Root Library)
- :x: **netty-handler-4.1.29.Final.jar** (Vulnerable Library)
</details>
<details><summary><b>netty-codec-http-4.1.29.Final.jar</b></p></summary>
<p>Netty is an asynchronous event-driven network application framework for
rapid development of maintainable high performance protocol servers and
clients.</p>
<p>Library home page: <a href="http://netty.io/">http://netty.io/</a></p>
<p>Path to dependency file: core/nimbus-test/pom.xml</p>
<p>Path to vulnerable library: /home/wss-scanner/.m2/repository/io/netty/netty-codec-http/4.1.29.Final/netty-codec-http-4.1.29.Final.jar,/home/wss-scanner/.m2/repository/io/netty/netty-codec-http/4.1.29.Final/netty-codec-http-4.1.29.Final.jar,/home/wss-scanner/.m2/repository/io/netty/netty-codec-http/4.1.29.Final/netty-codec-http-4.1.29.Final.jar</p>
<p>
Dependency Hierarchy:
- qpid-jms-client-0.41.0.jar (Root Library)
- :x: **netty-codec-http-4.1.29.Final.jar** (Vulnerable Library)
</details>
<p>Found in base branch: <b>master</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
Netty is an open-source, asynchronous event-driven network application framework for rapid development of maintainable high performance protocol servers & clients. In Netty before version 4.1.59.Final there is a vulnerability on Unix-like systems involving an insecure temp file. When netty's multipart decoders are used, local information disclosure can occur via the local system temporary directory if temporary storing uploads on the disk is enabled. On unix-like systems, the temporary directory is shared between all users. As such, writing to this directory using APIs that do not explicitly set the file/directory permissions can lead to information disclosure. Of note, this does not impact modern MacOS Operating Systems. The method "File.createTempFile" on unix-like systems creates a random file, but, by default will create this file with the permissions "-rw-r--r--". Thus, if sensitive information is written to this file, other local users can read this information. This is the case in netty's "AbstractDiskHttpData", which is vulnerable. This has been fixed in version 4.1.59.Final. As a workaround, one may specify your own "java.io.tmpdir" when you start the JVM or use "DefaultHttpDataFactory.setBaseDir(...)" to set the directory to something that is only readable by the current user.
<p>Publish Date: 2021-02-08
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-21290>CVE-2021-21290</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>5.5</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Local
- Attack Complexity: Low
- Privileges Required: Low
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: None
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://github.com/netty/netty/security/advisories/GHSA-5mcr-gq6c-3hq2">https://github.com/netty/netty/security/advisories/GHSA-5mcr-gq6c-3hq2</a></p>
<p>Release Date: 2021-02-08</p>
<p>Fix Resolution: io.netty:netty-codec-http:4.1.59.Final</p>
</p>
</details>
<p></p>
<!-- <REMEDIATE>{"isOpenPROnVulnerability":true,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"Java","groupId":"io.netty","packageName":"netty-all","packageVersion":"4.1.2.Final","packageFilePaths":["/nimbus-core/pom.xml","/nimbus-entity-dsl/pom.xml","/nimbus-test/pom.xml"],"isTransitiveDependency":false,"dependencyTree":"io.netty:netty-all:4.1.2.Final","isMinimumFixVersionAvailable":true,"minimumFixVersion":"io.netty:netty-codec-http:4.1.59.Final"},{"packageType":"Java","groupId":"io.netty","packageName":"netty-handler","packageVersion":"4.1.29.Final","packageFilePaths":["/nimbus-test/pom.xml","/nimbus-core/pom.xml","/nimbus-entity-dsl/pom.xml"],"isTransitiveDependency":true,"dependencyTree":"org.apache.qpid:qpid-jms-client:0.41.0;io.netty:netty-handler:4.1.29.Final","isMinimumFixVersionAvailable":true,"minimumFixVersion":"io.netty:netty-codec-http:4.1.59.Final"},{"packageType":"Java","groupId":"io.netty","packageName":"netty-codec-http","packageVersion":"4.1.29.Final","packageFilePaths":["/nimbus-test/pom.xml","/nimbus-core/pom.xml","/nimbus-entity-dsl/pom.xml"],"isTransitiveDependency":true,"dependencyTree":"org.apache.qpid:qpid-jms-client:0.41.0;io.netty:netty-codec-http:4.1.29.Final","isMinimumFixVersionAvailable":true,"minimumFixVersion":"io.netty:netty-codec-http:4.1.59.Final"}],"baseBranches":["master"],"vulnerabilityIdentifier":"CVE-2021-21290","vulnerabilityDetails":"Netty is an open-source, asynchronous event-driven network application framework for rapid development of maintainable high performance protocol servers \u0026 clients. In Netty before version 4.1.59.Final there is a vulnerability on Unix-like systems involving an insecure temp file. When netty\u0027s multipart decoders are used local information disclosure can occur via the local system temporary directory if temporary storing uploads on the disk is enabled. On unix-like systems, the temporary directory is shared between all user. As such, writing to this directory using APIs that do not explicitly set the file/directory permissions can lead to information disclosure. Of note, this does not impact modern MacOS Operating Systems. The method \"File.createTempFile\" on unix-like systems creates a random file, but, by default will create this file with the permissions \"-rw-r--r--\". Thus, if sensitive information is written to this file, other local users can read this information. This is the case in netty\u0027s \"AbstractDiskHttpData\" is vulnerable. This has been fixed in version 4.1.59.Final. As a workaround, one may specify your own \"java.io.tmpdir\" when you start the JVM or use \"DefaultHttpDataFactory.setBaseDir(...)\" to set the directory to something that is only readable by the current user.","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-21290","cvss3Severity":"medium","cvss3Score":"5.5","cvss3Metrics":{"A":"None","AC":"Low","PR":"Low","S":"Unchanged","C":"High","UI":"None","AV":"Local","I":"None"},"extraData":{}}</REMEDIATE> -->
|
non_process
|
cve medium detected in multiple libraries cve medium severity vulnerability vulnerable libraries netty all final jar netty handler final jar netty codec http final jar netty all final jar netty is an asynchronous event driven network application framework for rapid development of maintainable high performance protocol servers and clients library home page a href path to dependency file core nimbus core pom xml path to vulnerable library canner repository io netty netty all final netty all final jar canner repository io netty netty all final netty all final jar canner repository io netty netty all final netty all final jar dependency hierarchy x netty all final jar vulnerable library netty handler final jar netty is an asynchronous event driven network application framework for rapid development of maintainable high performance protocol servers and clients library home page a href path to dependency file core nimbus test pom xml path to vulnerable library home wss scanner repository io netty netty handler final netty handler final jar home wss scanner repository io netty netty handler final netty handler final jar home wss scanner repository io netty netty handler final netty handler final jar dependency hierarchy qpid jms client jar root library x netty handler final jar vulnerable library netty codec http final jar netty is an asynchronous event driven network application framework for rapid development of maintainable high performance protocol servers and clients library home page a href path to dependency file core nimbus test pom xml path to vulnerable library home wss scanner repository io netty netty codec http final netty codec http final jar home wss scanner repository io netty netty codec http final netty codec http final jar home wss scanner repository io netty netty codec http final netty codec http final jar dependency hierarchy qpid jms client jar root library x netty codec http final jar vulnerable library found in base branch master vulnerability details netty is an open source asynchronous event driven network application framework for rapid development of maintainable high performance protocol servers clients in netty before version final there is a vulnerability on unix like systems involving an insecure temp file when netty s multipart decoders are used local information disclosure can occur via the local system temporary directory if temporary storing uploads on the disk is enabled on unix like systems the temporary directory is shared between all user as such writing to this directory using apis that do not explicitly set the file directory permissions can lead to information disclosure of note this does not impact modern macos operating systems the method file createtempfile on unix like systems creates a random file but by default will create this file with the permissions rw r r thus if sensitive information is written to this file other local users can read this information this is the case in netty s abstractdiskhttpdata is vulnerable this has been fixed in version final as a workaround one may specify your own java io tmpdir when you start the jvm or use defaulthttpdatafactory setbasedir to set the directory to something that is only readable by the current user publish date url a href cvss score details base score metrics exploitability metrics attack vector local attack complexity low privileges required low user interaction none scope unchanged impact metrics confidentiality impact high integrity impact none availability impact none for more information on 
scores click a href suggested fix type upgrade version origin a href release date fix resolution io netty netty codec http final isopenpronvulnerability true ispackagebased true isdefaultbranch true packages istransitivedependency false dependencytree io netty netty all final isminimumfixversionavailable true minimumfixversion io netty netty codec http final packagetype java groupid io netty packagename netty handler packageversion final packagefilepaths istransitivedependency true dependencytree org apache qpid qpid jms client io netty netty handler final isminimumfixversionavailable true minimumfixversion io netty netty codec http final packagetype java groupid io netty packagename netty codec http packageversion final packagefilepaths istransitivedependency true dependencytree org apache qpid qpid jms client io netty netty codec http final isminimumfixversionavailable true minimumfixversion io netty netty codec http final basebranches vulnerabilityidentifier cve vulnerabilitydetails netty is an open source asynchronous event driven network application framework for rapid development of maintainable high performance protocol servers clients in netty before version final there is a vulnerability on unix like systems involving an insecure temp file when netty multipart decoders are used local information disclosure can occur via the local system temporary directory if temporary storing uploads on the disk is enabled on unix like systems the temporary directory is shared between all user as such writing to this directory using apis that do not explicitly set the file directory permissions can lead to information disclosure of note this does not impact modern macos operating systems the method file createtempfile on unix like systems creates a random file but by default will create this file with the permissions rw r r thus if sensitive information is written to this file other local users can read this information this is the case in netty abstractdiskhttpdata is vulnerable this has been fixed in version final as a workaround one may specify your own java io tmpdir when you start the jvm or use defaulthttpdatafactory setbasedir to set the directory to something that is only readable by the current user vulnerabilityurl
| 0
|
439,517
| 12,683,360,296
|
IssuesEvent
|
2020-06-19 19:33:47
|
JacquesCarette/Drasil
|
https://api.github.com/repos/JacquesCarette/Drasil
|
opened
|
Remove inputFunc from GOOL
|
Low Priority
|
GOOL's internal type class for values, `RenderValue`, includes `inputFunc`, meant to represent the call to a function for reading input from the command line.
This doesn't lead to ugly code the way the similar `printFunc` (etc.) methods do (see #2188), but I still don't see any benefit to having it as a type class method. I didn't use it in the Swift renderer; instead I defined the input-reading function call as a regular Haskell function and used that in the implementations of the input-reading functions that are actually exposed in GOOL's interface (`getInput` and `discardInput`). I think we should do the same for the other languages and then remove `inputFunc` from GOOL.
|
1.0
|
Remove inputFunc from GOOL - GOOL's internal type class for values, `RenderValue`, includes `inputFunc`, meant to represent the call to a function for reading input from the command line.
This doesn't lead to ugly code the way the similar `printFunc` (etc.) methods do (see #2188), but I still don't see any benefit to having it as a type class method. I didn't use it in the Swift renderer; instead I defined the input-reading function call as a regular Haskell function and used that in the implementations of the input-reading functions that are actually exposed in GOOL's interface (`getInput` and `discardInput`). I think we should do the same for the other languages and then remove `inputFunc` from GOOL.
|
non_process
|
remove inputfunc from gool gool s internal type class for values rendervalue includes inputfunc meant to represent the call to a function for reading input from the command line this doesn t lead to ugly code like the similar printfunc etc methods do see but i still don t see any benefit to having it as a type class method i didn t use it in the swift renderer and instead defined the input reading function call as a regular haskell function and used that in the implementations for the input reading functions that are actually exposed in gool s interface getinput and discardinput i think we should do the same for the other languages and then remove inputfunc from gool
| 0
|
16,548
| 21,568,599,050
|
IssuesEvent
|
2022-05-02 04:17:56
|
lynnandtonic/nestflix.fun
|
https://api.github.com/repos/lynnandtonic/nestflix.fun
|
closed
|
Add Roscoe
|
suggested title in process
|
Please add as much of the following info as you can:
Title:
Roscoe.
Type (film/tv show):
TV show.
Film or show in which it appears:
Mythic Quest.
Is the parent film/show streaming anywhere?
Apple TV+
About when in the parent film/show does it appear?
Season 1, episode 8, around 20 minutes in.
Actual footage of the film/show can be seen (yes/no)?
Yes.
|
1.0
|
Add Roscoe - Please add as much of the following info as you can:
Title:
Roscoe.
Type (film/tv show):
TV show.
Film or show in which it appears:
Mythic Quest.
Is the parent film/show streaming anywhere?
Apple TV+
About when in the parent film/show does it appear?
Season 1, episode 8, around 20 minutes in.
Actual footage of the film/show can be seen (yes/no)?
Yes.
|
process
|
add roscoe please add as much of the following info as you can title roscoe type film tv show tv show film or show in which it appears mythic quest is the parent film show streaming anywhere apple tv about when in the parent film show does it appear season episode around minutes in actual footage of the film show can be seen yes no yes
| 1
|
424,591
| 12,313,405,927
|
IssuesEvent
|
2020-05-12 15:15:43
|
soed2020-teamorange/ch.bfh.bti7081.s2020.orange
|
https://api.github.com/repos/soed2020-teamorange/ch.bfh.bti7081.s2020.orange
|
closed
|
UI for registering a new patient
|
enhancement high priority ready implementation task
|
# Description
A UI should be created with which a therapist can register a new patient.
# Effort
Estimated effort: 5h
Current effort:
Updated effort:
# Sub Tasks
|
1.0
|
UI for registering a new patient - # Description
A UI should be created with which a therapist can register a new patient.
# Effort
Estimated effort: 5h
Current effort:
Updated effort:
# Sub Tasks
|
non_process
|
ui for registering a new patient description a ui should be created with which a therapist can register a new patient effort estimated effort current effort updated effort sub tasks
| 0
|
10,512
| 13,283,969,260
|
IssuesEvent
|
2020-08-24 05:02:08
|
tikv/tikv
|
https://api.github.com/repos/tikv/tikv
|
opened
|
copr: fast-path Chunk format encoding
|
sig/coprocessor type/enhancement
|
## Development Task
Now that the Chunk format is implemented [in this RFC](https://github.com/tikv/rfcs/pull/43), we can directly construct the TiDB Chunk format instead of copying elements into Chunk one by one.
This development task involves changing how `Column` is constructed. For the `Int`, `Decimal`, `DateTime`, `Bytes` and `Json` `VectorValue` variants, the data is stored exactly as specified in the TiDB Chunk format. Hence, we could directly copy what's inside `ChunkedVec` to `Column`.
We'll need to:
* Add fast-path to `Column::from_vector_value`
* Add unit tests
I have worked on this before in https://github.com/tikv/tikv/pull/8300. However, I still couldn't figure out why this simple change causes all tests to fail.
Another challenge is that `logical_rows` may not be identical. We must ensure `logical_rows` is identical before utilizing fast-path, which might cause extra overhead. This can be optimized with a new `LogicalRows` structure covered in another development task.
|
1.0
|
copr: fast-path Chunk format encoding - ## Development Task
Now that the Chunk format is implemented [in this RFC](https://github.com/tikv/rfcs/pull/43), we can directly construct the TiDB Chunk format instead of copying elements into Chunk one by one.
This development task involves changing how `Column` is constructed. For the `Int`, `Decimal`, `DateTime`, `Bytes` and `Json` `VectorValue` variants, the data is stored exactly as specified in the TiDB Chunk format. Hence, we could directly copy what's inside `ChunkedVec` to `Column`.
We'll need to:
* Add fast-path to `Column::from_vector_value`
* Add unit tests
I have worked on this before in https://github.com/tikv/tikv/pull/8300. However, I still couldn't figure out why this simple change causes all tests to fail.
Another challenge is that `logical_rows` may not be identical. We must ensure `logical_rows` is identical before utilizing fast-path, which might cause extra overhead. This can be optimized with a new `LogicalRows` structure covered in another development task.
|
process
|
copr fast path chunk format encoding development task after chunk format is implemented we can now directly construct tidb chunk format instead of one by one copying elements into chunk this development task involves changing how column is constructed for int decimal datetime bytes and json vectorvalue they are stored exactly the same as specified in tidb chunk format hence we could directly copy what s inside chunkedvec to column we ll need to add fast path to column from vector value add unit tests i have worked on this before in however i still couldn t figure out why this simple change causes all tests fail another challenge is that logical rows may not be identical we must ensure logical rows is identical before utilizing fast path which might cause extra overhead this can be optimized with a new logicalrows structure covered in another development task
| 1
|
12,125
| 9,575,498,206
|
IssuesEvent
|
2019-05-07 06:36:57
|
microsoft/azure-pipelines-image-generation
|
https://api.github.com/repos/microsoft/azure-pipelines-image-generation
|
closed
|
Azure CLI version is not displayed correctly in generated readme file for image (VS2017/VS2019)
|
area:Deployment/DBs/services image:Windows issue:Bug report
|
When generating the VS2017 and VS2019 images, the Validate-AzureCli.ps1 script is supposed to generate the output used for the readme file that describes the image spec. The output should include the version number. However, I noticed that in the last VS2017 image I generated, the version was not printed out; see:
https://github.com/Microsoft/azure-pipelines-image-generation/blob/releases/VS2017/149.1/images/win/Vs2017-Server2016-Readme.md#azure-cli
However, I verified that in the generated image Azure-Cli version 2.0.59 was properly installed (and reported by `az --version`).
**Image impacted**
VS2017 and VS2019 since they share the same script.
|
1.0
|
Azure CLI version is not displayed correctly in generated readme file for image (VS2017/VS2019) - When generating the VS2017 and VS2019 images, the Validate-AzureCli.ps1 script is supposed to generate the output used for the readme file that describes the image spec. The output should include the version number. However, I noticed that in the last VS2017 image I generated, the version was not printed out; see:
https://github.com/Microsoft/azure-pipelines-image-generation/blob/releases/VS2017/149.1/images/win/Vs2017-Server2016-Readme.md#azure-cli
However, I verified that in the generated image Azure-Cli version 2.0.59 was properly installed (and reported by `az --version`).
**Image impacted**
VS2017 and VS2019 since they share the same script.
|
non_process
|
azure cli version is not displayed correctly in generated readme file for image when generating the and images the validate azurecli script is supposed to generate the output used for the readme file that describes the image spec the output should include the version number however i noticed that in the last i generated the version was not printed out see however i verified that in the generated image azure cli version was properly installed and reported by az version image impacted and since they share the same script
| 0
|
12,215
| 14,742,982,640
|
IssuesEvent
|
2021-01-07 13:13:15
|
kdjstudios/SABillingGitlab
|
https://api.github.com/repos/kdjstudios/SABillingGitlab
|
closed
|
Winnipeg NCSM 1st billing
|
anc-process anp-emergency release anp-important ant-parent/primary ant-support has attachment
|
In GitLab by @kdjstudios on Jun 27, 2019, 09:08
**Submitted by:** "Michelle Mckee" <michelle.mckee@answernet.com>
**Helpdesk:** http://www.servicedesk.answernet.com/profiles/ticket/2019-06-27-87158/conversation
**Server:** Internal
**Client/Site:** Winnipeg
**Account:** NA
**Issue:**
Somehow the NCSM 1st Cycle Accounts billing cycles for Winnipeg got messed up.
As you can see in the billing cycle download from SABilling below, the billing cycle “12/29/2018 NCSM 1st Cycle Accounts” and the billing cycle “01/29/2019 NCSM 1st Cycle Accounts” have the same invoice date and the same service period start and end dates. If you look at the invoices from the 1/29/19 cycle, they are zero. The invoices from the 12/29 cycle show activity.
Then, the 3/28 billing cycle and the 5/28 billing cycle both have the same invoice date and the same service period start and end dates.

Below is how they should have looked.

Is there any way to get this all back on track, with the 8/28 billing cycle having an invoice date of 9/1 and a service period of 9/1/19 – 9/30/19?
|
1.0
|
Winnipeg NCSM 1st billing - In GitLab by @kdjstudios on Jun 27, 2019, 09:08
**Submitted by:** "Michelle Mckee" <michelle.mckee@answernet.com>
**Helpdesk:** http://www.servicedesk.answernet.com/profiles/ticket/2019-06-27-87158/conversation
**Server:** Internal
**Client/Site:** Winnipeg
**Account:** NA
**Issue:**
Somehow the NCSM 1st Cycle Accounts billing cycles for Winnipeg got messed up.
As you can see in the billing cycle download from SABilling below, the billing cycle “12/29/2018 NCSM 1st Cycle Accounts” and the billing cycle “01/29/2019 NCSM 1st Cycle Accounts” have the same invoice date and the same service period start and end dates. If you look at the invoices from the 1/29/19 cycle, they are zero. The invoices from the 12/29 cycle show activity.
Then, the 3/28 billing cycle and the 5/28 billing cycle both have the same invoice date and the same service period start and end dates.

Below is how they should have looked.

Is there any way to get this all back on track, with the 8/28 billing cycle having an invoice date of 9/1 and a service period of 9/1/19 – 9/30/19?
|
process
|
winnipeg ncsm billing in gitlab by kdjstudios on jun submitted by michelle mckee helpdesk server internal client site winnipeg account na issue somehow the ncsm cycle accounts billing cycles for winnipeg got messed up as you can see on the billing cycle download from sabilling below the billing cycle “ ncsm cycle accounts” and the billing cycle “ ncsm cycle accounts” have the same invoice date and the same service period start and end dates if you look at the invoices from the cycle they are zero the invoices from the cycle show activity then the billing cycle and the billing cycle both have the same invoice date and the same service period start and end dates uploads image png below is how they should have looked uploads image png is there anyway to get this all back on track the billing cycle having an invoice date of service period of –
| 1
|
16,879
| 22,157,546,084
|
IssuesEvent
|
2022-06-04 02:41:36
|
hashgraph/hedera-json-rpc-relay
|
https://api.github.com/repos/hashgraph/hedera-json-rpc-relay
|
closed
|
Add k6 performance test support for relay
|
enhancement P2 process
|
### Problem
The repo currently has UTs, some integration tests, and acceptance tests.
However, it does not yet have easy-to-run performance tests.
### Solution
We have had success with the [k6 test suite](https://k6.io/)
Add support for k6 run
- Add a README noting run steps similar to https://github.com/hashgraph/hedera-mirror-node/tree/main/hedera-mirror-test/k6
- Add k6 test scripts similar to https://github.com/hashgraph/hedera-mirror-node/tree/main/hedera-mirror-test/k6/src/web3
- Add test cases for the heavily used endpoints.
Test coverage can be added in phases, i.e. methods that don't call any nodes, methods that call the mirror node, and methods that call consensus nodes
### Alternatives
_No response_
|
1.0
|
Add k6 performance test support for relay - ### Problem
The repo currently has UTs, some integration tests, and acceptance tests.
However, it does not yet have easy-to-run performance tests.
### Solution
We have had success with the [k6 test suite](https://k6.io/)
Add support for k6 run
- Add a README noting run steps similar to https://github.com/hashgraph/hedera-mirror-node/tree/main/hedera-mirror-test/k6
- Add k6 test scripts similar to https://github.com/hashgraph/hedera-mirror-node/tree/main/hedera-mirror-test/k6/src/web3
- Add test cases for the heavily used endpoints.
Test coverage can be added in phases, i.e. methods that don't call any nodes, methods that call the mirror node, and methods that call consensus nodes
### Alternatives
_No response_
|
process
|
add performance test support for relay problem the repo currently has ut s some integration tests and acceptance tests however it does not yet have easy to run performance tests solution success in usage was found with add support for run add a readme noting run steps similar to add test scripts similar to add test cases for the heavily used endpoints test coverage can be added in phases i e methods that don t call any nodes methods that call mirror node methods that call consensus nodes alternatives no response
| 1
|
6,273
| 9,231,175,168
|
IssuesEvent
|
2019-03-13 01:08:50
|
EthVM/EthVM
|
https://api.github.com/repos/EthVM/EthVM
|
closed
|
Exception while processing blocks
|
bug project:processing
|
I started a new processing run from an empty state (every Docker machine reset, everything up and running). Initially I got the exception in the kafka-streams project. Upon inspection it turned out that, for some reason, the ethereumj client received a bad block that it can't decode properly, so it dropped that "processed" block (maybe the error is worse than that).
Take a look and see if we can gather more insights (it appears to happen randomly; maybe my network was not behaving great this morning, and I'm not sure it is easy to replicate):
In kafka-streams:
```
06:32:24.212 ERROR o.a.k.s.p.i.AssignedStreamsTasks - stream-thread [block-processor-3c83b05f-1d01-480e-afee-d5202e35a934-StreamThread-1] Failed to process stream task 0_0 due to the following error:
java.lang.IllegalStateException: Block out of sequence. Expected = 128, received = 129
at io.enkrypt.kafka.streams.processors.block.ChainEventsTransformer.ensureSequentialProcessing(ChainEventsTransformer.kt:99)
at io.enkrypt.kafka.streams.processors.block.ChainEventsTransformer.transform(ChainEventsTransformer.kt:84)
at io.enkrypt.kafka.streams.processors.block.ChainEventsTransformer.transform(ChainEventsTransformer.kt:24)
at org.apache.kafka.streams.kstream.internals.KStreamTransform$KStreamTransformProcessor.process(KStreamTransform.java:56)
at org.apache.kafka.streams.processor.internals.ProcessorNode.process(ProcessorNode.java:115)
at org.apache.kafka.streams.processor.internals.ProcessorContextImpl.forward(ProcessorContextImpl.java:146)
at org.apache.kafka.streams.processor.internals.ProcessorContextImpl.forward(ProcessorContextImpl.java:132)
at org.apache.kafka.streams.processor.internals.ProcessorContextImpl.forward(ProcessorContextImpl.java:93)
at org.apache.kafka.streams.kstream.internals.KStreamPeek$KStreamPeekProcessor.process(KStreamPeek.java:44)
at org.apache.kafka.streams.processor.internals.ProcessorNode.process(ProcessorNode.java:115)
at org.apache.kafka.streams.processor.internals.ProcessorContextImpl.forward(ProcessorContextImpl.java:146)
at org.apache.kafka.streams.processor.internals.ProcessorContextImpl.forward(ProcessorContextImpl.java:129)
at org.apache.kafka.streams.processor.internals.ProcessorContextImpl.forward(ProcessorContextImpl.java:93)
at org.apache.kafka.streams.processor.internals.SourceNode.process(SourceNode.java:84)
at org.apache.kafka.streams.processor.internals.StreamTask.process(StreamTask.java:351)
at org.apache.kafka.streams.processor.internals.AssignedStreamsTasks.process(AssignedStreamsTasks.java:104)
at org.apache.kafka.streams.processor.internals.TaskManager.process(TaskManager.java:413)
at org.apache.kafka.streams.processor.internals.StreamThread.runOnce(StreamThread.java:862)
at org.apache.kafka.streams.processor.internals.StreamThread.runLoop(StreamThread.java:777)
at org.apache.kafka.streams.processor.internals.StreamThread.run(StreamThread.java:747)
06:32:24.212 ERROR o.a.k.s.p.internals.StreamThread - stream-thread [block-processor-3c83b05f-1d01-480e-afee-d5202e35a934-StreamThread-1] Encountered the following error during processing:
java.lang.IllegalStateException: Block out of sequence. Expected = 128, received = 129
at io.enkrypt.kafka.streams.processors.block.ChainEventsTransformer.ensureSequentialProcessing(ChainEventsTransformer.kt:99)
at io.enkrypt.kafka.streams.processors.block.ChainEventsTransformer.transform(ChainEventsTransformer.kt:84)
at io.enkrypt.kafka.streams.processors.block.ChainEventsTransformer.transform(ChainEventsTransformer.kt:24)
at org.apache.kafka.streams.kstream.internals.KStreamTransform$KStreamTransformProcessor.process(KStreamTransform.java:56)
at org.apache.kafka.streams.processor.internals.ProcessorNode.process(ProcessorNode.java:115)
at org.apache.kafka.streams.processor.internals.ProcessorContextImpl.forward(ProcessorContextImpl.java:146)
at org.apache.kafka.streams.processor.internals.ProcessorContextImpl.forward(ProcessorContextImpl.java:132)
at org.apache.kafka.streams.processor.internals.ProcessorContextImpl.forward(ProcessorContextImpl.java:93)
at org.apache.kafka.streams.kstream.internals.KStreamPeek$KStreamPeekProcessor.process(KStreamPeek.java:44)
at org.apache.kafka.streams.processor.internals.ProcessorNode.process(ProcessorNode.java:115)
at org.apache.kafka.streams.processor.internals.ProcessorContextImpl.forward(ProcessorContextImpl.java:146)
at org.apache.kafka.streams.processor.internals.ProcessorContextImpl.forward(ProcessorContextImpl.java:129)
at org.apache.kafka.streams.processor.internals.ProcessorContextImpl.forward(ProcessorContextImpl.java:93)
at org.apache.kafka.streams.processor.internals.SourceNode.process(SourceNode.java:84)
at org.apache.kafka.streams.processor.internals.StreamTask.process(StreamTask.java:351)
at org.apache.kafka.streams.processor.internals.AssignedStreamsTasks.process(AssignedStreamsTasks.java:104)
at org.apache.kafka.streams.processor.internals.TaskManager.process(TaskManager.java:413)
at org.apache.kafka.streams.processor.internals.StreamThread.runOnce(StreamThread.java:862)
at org.apache.kafka.streams.processor.internals.StreamThread.runLoop(StreamThread.java:777)
at org.apache.kafka.streams.processor.internals.StreamThread.run(StreamThread.java:747)
06:32:24.255 WARN o.apache.kafka.streams.KafkaStreams - stream-client [block-processor-3c83b05f-1d01-480e-afee-d5202e35a934] All stream threads have died. The instance will be in error state and should be closed.
Exception in thread "block-processor-3c83b05f-1d01-480e-afee-d5202e35a934-StreamThread-1" java.lang.IllegalStateException: Block out of sequence. Expected = 128, received = 129
at io.enkrypt.kafka.streams.processors.block.ChainEventsTransformer.ensureSequentialProcessing(ChainEventsTransformer.kt:99)
at io.enkrypt.kafka.streams.processors.block.ChainEventsTransformer.transform(ChainEventsTransformer.kt:84)
at io.enkrypt.kafka.streams.processors.block.ChainEventsTransformer.transform(ChainEventsTransformer.kt:24)
at org.apache.kafka.streams.kstream.internals.KStreamTransform$KStreamTransformProcessor.process(KStreamTransform.java:56)
at org.apache.kafka.streams.processor.internals.ProcessorNode.process(ProcessorNode.java:115)
at org.apache.kafka.streams.processor.internals.ProcessorContextImpl.forward(ProcessorContextImpl.java:146)
at org.apache.kafka.streams.processor.internals.ProcessorContextImpl.forward(ProcessorContextImpl.java:132)
at org.apache.kafka.streams.processor.internals.ProcessorContextImpl.forward(ProcessorContextImpl.java:93)
at org.apache.kafka.streams.kstream.internals.KStreamPeek$KStreamPeekProcessor.process(KStreamPeek.java:44)
at org.apache.kafka.streams.processor.internals.ProcessorNode.process(ProcessorNode.java:115)
at org.apache.kafka.streams.processor.internals.ProcessorContextImpl.forward(ProcessorContextImpl.java:146)
at org.apache.kafka.streams.processor.internals.ProcessorContextImpl.forward(ProcessorContextImpl.java:129)
at org.apache.kafka.streams.processor.internals.ProcessorContextImpl.forward(ProcessorContextImpl.java:93)
at org.apache.kafka.streams.processor.internals.SourceNode.process(SourceNode.java:84)
at org.apache.kafka.streams.processor.internals.StreamTask.process(StreamTask.java:351)
at org.apache.kafka.streams.processor.internals.AssignedStreamsTasks.process(AssignedStreamsTasks.java:104)
at org.apache.kafka.streams.processor.internals.TaskManager.process(TaskManager.java:413)
at org.apache.kafka.streams.processor.internals.StreamThread.runOnce(StreamThread.java:862)
at org.apache.kafka.streams.processor.internals.StreamThread.runLoop(StreamThread.java:777)
at org.apache.kafka.streams.processor.internals.StreamThread.run(StreamThread.java:747)
```
And in EthereumJ:
```
06:31:11.051 INFO [o.a.k.c.p.KafkaProducer] [Producer clientId=ethereumj-block-summaries, transactionalId=ethereumj] Instantiated a transactional producer.
06:31:11.052 INFO [o.a.k.c.p.KafkaProducer] [Producer clientId=ethereumj-block-summaries, transactionalId=ethereumj] Overriding the default retries config to the recommended value of 2147483647 since the idempotent producer is enabled.
06:31:11.053 INFO [o.a.k.c.p.KafkaProducer] [Producer clientId=ethereumj-block-summaries, transactionalId=ethereumj] Overriding the default acks to all since idempotence is enabled.
06:31:11.087 INFO [o.a.k.c.u.AppInfoParser] Kafka version : 2.0.1-cp1
06:31:11.087 INFO [o.a.k.c.u.AppInfoParser] Kafka commitId : c41567a0eb63796c
06:31:11.093 INFO [o.a.k.c.p.i.TransactionManager] [Producer clientId=ethereumj-block-summaries, transactionalId=ethereumj] ProducerId set to -1 with epoch -1
06:31:12.248 INFO [o.a.k.c.p.i.TransactionManager] [Producer clientId=ethereumj-block-summaries, transactionalId=ethereumj] ProducerId set to 0 with epoch 0
06:31:12.456 INFO [general] rocksdb key-value data source created: peers
06:31:12.466 INFO [general] External IP wasn't set, using checkip.amazonaws.com to identify it...
06:31:12.727 INFO [general] External address identified: 62.57.153.25
06:31:12.852 INFO [general] EthereumJ node started: enode://3ebe8690ed1008098eea32bbb0a27bceb83b38f2b3059c1e46acef48057e99712264af0c80628c24d5582345de224a7325dfc951600b02418ee3a49de92ded20@62.57.153.25:30303
06:31:12.889 INFO [general] DB is empty - adding Genesis
06:31:13.178 INFO [general] Genesis block loaded
06:31:13.246 INFO [general] Bind address wasn't set, Punching to identify it...
06:31:13.274 INFO [general] UDP local bound to: 192.168.232.136
06:31:13.288 INFO [discover] Discovery UDPListener started
06:31:13.551 INFO [net] Listening for incoming connections, port: [30303]
06:31:13.557 INFO [net] NodeId: [3ebe8690ed1008098eea32bbb0a27bceb83b38f2b3059c1e46acef48057e99712264af0c80628c24d5582345de224a7325dfc951600b02418ee3a49de92ded20]
06:31:13.682 INFO [discover] Reading Node statistics from DB: 0 nodes.
06:31:13.821 INFO [discover] Pinging discovery nodes...
06:31:14.186 INFO [discover] Received response.
06:31:14.240 WARN [blockchain] EDT task executed in more than 1 sec: 1061ms, Executor queue size: 28
06:31:14.250 INFO [ethash] Kept caches: cnt: 1 epochs: 0...0
06:31:14.334 INFO [o.a.k.c.Metadata] Cluster ID: RZStG7oBREeUyUMy6f-RmQ
06:31:14.954 INFO [discover] New peers discovered.
06:31:15.345 INFO [kafka-listener] Published 1 block(s)
06:31:23.284 INFO [net] TCP: Speed in/out 1779b / 2Kb(sec), packets in/out 81/93, total in/out: 17Kb / 23Kb
06:31:23.287 INFO [net] UDP: Speed in/out 3Kb / 2Kb(sec), packets in/out 140/192, total in/out: 30Kb / 26Kb
06:31:24.232 INFO [net] New peers processed: [c431ad9d | /66.63.190.186:30303], active peers added: 1, total active peers: 1
06:31:33.284 INFO [net] TCP: Speed in/out 14Kb / 3Kb(sec), packets in/out 164/128, total in/out: 166Kb / 55Kb
06:31:33.284 INFO [net] UDP: Speed in/out 1130b / 1095b(sec), packets in/out 45/78, total in/out: 41Kb / 36Kb
06:31:37.906 WARN [net] Can't decrypt AuthInitiateMessage from /127.0.0.1:54564. Most likely the remote peer used wrong public key (NodeID) to encrypt message.
06:31:43.284 INFO [net] TCP: Speed in/out 7Kb / 2Kb(sec), packets in/out 82/58, total in/out: 239Kb / 77Kb
06:31:43.284 INFO [net] UDP: Speed in/out 731b / 1090b(sec), packets in/out 34/68, total in/out: 48Kb / 47Kb
06:31:51.248 INFO [net] New peers processed: [58905431 | /128.199.83.11:30303], active peers added: 1, total active peers: 2
06:31:52.554 WARN [net] Can't decrypt AuthInitiateMessage from /127.0.0.1:54686. Most likely the remote peer used wrong public key (NodeID) to encrypt message.
06:31:53.284 INFO [net] TCP: Speed in/out 20Kb / 1579b(sec), packets in/out 117/65, total in/out: 448Kb / 93Kb
06:31:53.284 INFO [net] UDP: Speed in/out 2Kb / 2Kb(sec), packets in/out 91/162, total in/out: 72Kb / 69Kb
06:31:54.747 WARN [net] SnappyCodec failed:
io.netty.handler.codec.DecoderException: java.io.IOException: PARSING_ERROR(2)
at io.netty.handler.codec.MessageToMessageDecoder.channelRead(MessageToMessageDecoder.java:99)
at io.netty.handler.codec.MessageToMessageCodec.channelRead(MessageToMessageCodec.java:111)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:308)
at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:294)
at io.netty.handler.codec.ByteToMessageDecoder.channelRead(ByteToMessageDecoder.java:244)
at org.ethereum.net.rlpx.NettyByteToMessageCodec.channelRead(NettyByteToMessageCodec.java:73)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:308)
at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:294)
at io.netty.handler.codec.ByteToMessageDecoder.handlerRemoved(ByteToMessageDecoder.java:203)
at io.netty.channel.DefaultChannelPipeline.callHandlerRemoved0(DefaultChannelPipeline.java:527)
at io.netty.channel.DefaultChannelPipeline.callHandlerRemoved(DefaultChannelPipeline.java:521)
at io.netty.channel.DefaultChannelPipeline.remove0(DefaultChannelPipeline.java:351)
at io.netty.channel.DefaultChannelPipeline.remove(DefaultChannelPipeline.java:322)
at io.netty.channel.DefaultChannelPipeline.remove(DefaultChannelPipeline.java:299)
at org.ethereum.net.rlpx.HandshakeHandler.decode(HandshakeHandler.java:107)
at io.netty.handler.codec.ByteToMessageDecoder.callDecode(ByteToMessageDecoder.java:327)
at io.netty.handler.codec.ByteToMessageDecoder.channelRead(ByteToMessageDecoder.java:230)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:308)
at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:294)
at io.netty.channel.ChannelInboundHandlerAdapter.channelRead(ChannelInboundHandlerAdapter.java:86)
at org.ethereum.net.server.WireTrafficStats$TrafficStatHandler.channelRead(WireTrafficStats.java:103)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:308)
at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:294)
at io.netty.handler.timeout.ReadTimeoutHandler.channelRead(ReadTimeoutHandler.java:152)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:308)
at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:294)
at io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:846)
at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:131)
at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:511)
at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:468)
at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:382)
at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:354)
at io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:110)
at java.lang.Thread.run(Thread.java:748)
Caused by: java.io.IOException: PARSING_ERROR(2)
at org.xerial.snappy.SnappyNative.throw_error(SnappyNative.java:98)
at org.xerial.snappy.SnappyNative.uncompressedLength(Native Method)
at org.xerial.snappy.Snappy.uncompressedLength(Snappy.java:613)
at org.ethereum.net.rlpx.SnappyCodec.decode(SnappyCodec.java:80)
at org.ethereum.net.rlpx.SnappyCodec.decode(SnappyCodec.java:39)
at io.netty.handler.codec.MessageToMessageCodec$2.decode(MessageToMessageCodec.java:81)
at io.netty.handler.codec.MessageToMessageDecoder.channelRead(MessageToMessageDecoder.java:89)
... 33 common frames omitted
06:31:55.871 INFO [blockchain] *** Last block added [ #100 ]
```
|
1.0
|
Exception while processing blocks - I started a new processing run from an empty state (every Docker machine reset, everything up and running). Initially I got the exception in the kafka-streams project. Upon inspection it turned out that, for some reason, the ethereumj client received a bad block that it can't decode properly, so it dropped that "processed" block (maybe the error is worse than that).
Take a look and see if we can gather more insights (it appears to happen randomly; maybe my network was not behaving great this morning, and I'm not sure it is easy to replicate):
In kafka-streams:
```
06:32:24.212 ERROR o.a.k.s.p.i.AssignedStreamsTasks - stream-thread [block-processor-3c83b05f-1d01-480e-afee-d5202e35a934-StreamThread-1] Failed to process stream task 0_0 due to the following error:
java.lang.IllegalStateException: Block out of sequence. Expected = 128, received = 129
at io.enkrypt.kafka.streams.processors.block.ChainEventsTransformer.ensureSequentialProcessing(ChainEventsTransformer.kt:99)
at io.enkrypt.kafka.streams.processors.block.ChainEventsTransformer.transform(ChainEventsTransformer.kt:84)
at io.enkrypt.kafka.streams.processors.block.ChainEventsTransformer.transform(ChainEventsTransformer.kt:24)
at org.apache.kafka.streams.kstream.internals.KStreamTransform$KStreamTransformProcessor.process(KStreamTransform.java:56)
at org.apache.kafka.streams.processor.internals.ProcessorNode.process(ProcessorNode.java:115)
at org.apache.kafka.streams.processor.internals.ProcessorContextImpl.forward(ProcessorContextImpl.java:146)
at org.apache.kafka.streams.processor.internals.ProcessorContextImpl.forward(ProcessorContextImpl.java:132)
at org.apache.kafka.streams.processor.internals.ProcessorContextImpl.forward(ProcessorContextImpl.java:93)
at org.apache.kafka.streams.kstream.internals.KStreamPeek$KStreamPeekProcessor.process(KStreamPeek.java:44)
at org.apache.kafka.streams.processor.internals.ProcessorNode.process(ProcessorNode.java:115)
at org.apache.kafka.streams.processor.internals.ProcessorContextImpl.forward(ProcessorContextImpl.java:146)
at org.apache.kafka.streams.processor.internals.ProcessorContextImpl.forward(ProcessorContextImpl.java:129)
at org.apache.kafka.streams.processor.internals.ProcessorContextImpl.forward(ProcessorContextImpl.java:93)
at org.apache.kafka.streams.processor.internals.SourceNode.process(SourceNode.java:84)
at org.apache.kafka.streams.processor.internals.StreamTask.process(StreamTask.java:351)
at org.apache.kafka.streams.processor.internals.AssignedStreamsTasks.process(AssignedStreamsTasks.java:104)
at org.apache.kafka.streams.processor.internals.TaskManager.process(TaskManager.java:413)
at org.apache.kafka.streams.processor.internals.StreamThread.runOnce(StreamThread.java:862)
at org.apache.kafka.streams.processor.internals.StreamThread.runLoop(StreamThread.java:777)
at org.apache.kafka.streams.processor.internals.StreamThread.run(StreamThread.java:747)
06:32:24.212 ERROR o.a.k.s.p.internals.StreamThread - stream-thread [block-processor-3c83b05f-1d01-480e-afee-d5202e35a934-StreamThread-1] Encountered the following error during processing:
java.lang.IllegalStateException: Block out of sequence. Expected = 128, received = 129
at io.enkrypt.kafka.streams.processors.block.ChainEventsTransformer.ensureSequentialProcessing(ChainEventsTransformer.kt:99)
at io.enkrypt.kafka.streams.processors.block.ChainEventsTransformer.transform(ChainEventsTransformer.kt:84)
at io.enkrypt.kafka.streams.processors.block.ChainEventsTransformer.transform(ChainEventsTransformer.kt:24)
at org.apache.kafka.streams.kstream.internals.KStreamTransform$KStreamTransformProcessor.process(KStreamTransform.java:56)
at org.apache.kafka.streams.processor.internals.ProcessorNode.process(ProcessorNode.java:115)
at org.apache.kafka.streams.processor.internals.ProcessorContextImpl.forward(ProcessorContextImpl.java:146)
at org.apache.kafka.streams.processor.internals.ProcessorContextImpl.forward(ProcessorContextImpl.java:132)
at org.apache.kafka.streams.processor.internals.ProcessorContextImpl.forward(ProcessorContextImpl.java:93)
at org.apache.kafka.streams.kstream.internals.KStreamPeek$KStreamPeekProcessor.process(KStreamPeek.java:44)
at org.apache.kafka.streams.processor.internals.ProcessorNode.process(ProcessorNode.java:115)
at org.apache.kafka.streams.processor.internals.ProcessorContextImpl.forward(ProcessorContextImpl.java:146)
at org.apache.kafka.streams.processor.internals.ProcessorContextImpl.forward(ProcessorContextImpl.java:129)
at org.apache.kafka.streams.processor.internals.ProcessorContextImpl.forward(ProcessorContextImpl.java:93)
at org.apache.kafka.streams.processor.internals.SourceNode.process(SourceNode.java:84)
at org.apache.kafka.streams.processor.internals.StreamTask.process(StreamTask.java:351)
at org.apache.kafka.streams.processor.internals.AssignedStreamsTasks.process(AssignedStreamsTasks.java:104)
at org.apache.kafka.streams.processor.internals.TaskManager.process(TaskManager.java:413)
at org.apache.kafka.streams.processor.internals.StreamThread.runOnce(StreamThread.java:862)
at org.apache.kafka.streams.processor.internals.StreamThread.runLoop(StreamThread.java:777)
at org.apache.kafka.streams.processor.internals.StreamThread.run(StreamThread.java:747)
06:32:24.255 WARN o.apache.kafka.streams.KafkaStreams - stream-client [block-processor-3c83b05f-1d01-480e-afee-d5202e35a934] All stream threads have died. The instance will be in error state and should be closed.
Exception in thread "block-processor-3c83b05f-1d01-480e-afee-d5202e35a934-StreamThread-1" java.lang.IllegalStateException: Block out of sequence. Expected = 128, received = 129
at io.enkrypt.kafka.streams.processors.block.ChainEventsTransformer.ensureSequentialProcessing(ChainEventsTransformer.kt:99)
at io.enkrypt.kafka.streams.processors.block.ChainEventsTransformer.transform(ChainEventsTransformer.kt:84)
at io.enkrypt.kafka.streams.processors.block.ChainEventsTransformer.transform(ChainEventsTransformer.kt:24)
at org.apache.kafka.streams.kstream.internals.KStreamTransform$KStreamTransformProcessor.process(KStreamTransform.java:56)
at org.apache.kafka.streams.processor.internals.ProcessorNode.process(ProcessorNode.java:115)
at org.apache.kafka.streams.processor.internals.ProcessorContextImpl.forward(ProcessorContextImpl.java:146)
at org.apache.kafka.streams.processor.internals.ProcessorContextImpl.forward(ProcessorContextImpl.java:132)
at org.apache.kafka.streams.processor.internals.ProcessorContextImpl.forward(ProcessorContextImpl.java:93)
at org.apache.kafka.streams.kstream.internals.KStreamPeek$KStreamPeekProcessor.process(KStreamPeek.java:44)
at org.apache.kafka.streams.processor.internals.ProcessorNode.process(ProcessorNode.java:115)
at org.apache.kafka.streams.processor.internals.ProcessorContextImpl.forward(ProcessorContextImpl.java:146)
at org.apache.kafka.streams.processor.internals.ProcessorContextImpl.forward(ProcessorContextImpl.java:129)
at org.apache.kafka.streams.processor.internals.ProcessorContextImpl.forward(ProcessorContextImpl.java:93)
at org.apache.kafka.streams.processor.internals.SourceNode.process(SourceNode.java:84)
at org.apache.kafka.streams.processor.internals.StreamTask.process(StreamTask.java:351)
at org.apache.kafka.streams.processor.internals.AssignedStreamsTasks.process(AssignedStreamsTasks.java:104)
at org.apache.kafka.streams.processor.internals.TaskManager.process(TaskManager.java:413)
at org.apache.kafka.streams.processor.internals.StreamThread.runOnce(StreamThread.java:862)
at org.apache.kafka.streams.processor.internals.StreamThread.runLoop(StreamThread.java:777)
at org.apache.kafka.streams.processor.internals.StreamThread.run(StreamThread.java:747)
```
And in EthereumJ:
```
06:31:11.051 INFO [o.a.k.c.p.KafkaProducer] [Producer clientId=ethereumj-block-summaries, transactionalId=ethereumj] Instantiated a transactional producer.
06:31:11.052 INFO [o.a.k.c.p.KafkaProducer] [Producer clientId=ethereumj-block-summaries, transactionalId=ethereumj] Overriding the default retries config to the recommended value of 2147483647 since the idempotent producer is enabled.
06:31:11.053 INFO [o.a.k.c.p.KafkaProducer] [Producer clientId=ethereumj-block-summaries, transactionalId=ethereumj] Overriding the default acks to all since idempotence is enabled.
06:31:11.087 INFO [o.a.k.c.u.AppInfoParser] Kafka version : 2.0.1-cp1
06:31:11.087 INFO [o.a.k.c.u.AppInfoParser] Kafka commitId : c41567a0eb63796c
06:31:11.093 INFO [o.a.k.c.p.i.TransactionManager] [Producer clientId=ethereumj-block-summaries, transactionalId=ethereumj] ProducerId set to -1 with epoch -1
06:31:12.248 INFO [o.a.k.c.p.i.TransactionManager] [Producer clientId=ethereumj-block-summaries, transactionalId=ethereumj] ProducerId set to 0 with epoch 0
06:31:12.456 INFO [general] rocksdb key-value data source created: peers
06:31:12.466 INFO [general] External IP wasn't set, using checkip.amazonaws.com to identify it...
06:31:12.727 INFO [general] External address identified: 62.57.153.25
06:31:12.852 INFO [general] EthereumJ node started: enode://3ebe8690ed1008098eea32bbb0a27bceb83b38f2b3059c1e46acef48057e99712264af0c80628c24d5582345de224a7325dfc951600b02418ee3a49de92ded20@62.57.153.25:30303
06:31:12.889 INFO [general] DB is empty - adding Genesis
06:31:13.178 INFO [general] Genesis block loaded
06:31:13.246 INFO [general] Bind address wasn't set, Punching to identify it...
06:31:13.274 INFO [general] UDP local bound to: 192.168.232.136
06:31:13.288 INFO [discover] Discovery UDPListener started
06:31:13.551 INFO [net] Listening for incoming connections, port: [30303]
06:31:13.557 INFO [net] NodeId: [3ebe8690ed1008098eea32bbb0a27bceb83b38f2b3059c1e46acef48057e99712264af0c80628c24d5582345de224a7325dfc951600b02418ee3a49de92ded20]
06:31:13.682 INFO [discover] Reading Node statistics from DB: 0 nodes.
06:31:13.821 INFO [discover] Pinging discovery nodes...
06:31:14.186 INFO [discover] Received response.
06:31:14.240 WARN [blockchain] EDT task executed in more than 1 sec: 1061ms, Executor queue size: 28
06:31:14.250 INFO [ethash] Kept caches: cnt: 1 epochs: 0...0
06:31:14.334 INFO [o.a.k.c.Metadata] Cluster ID: RZStG7oBREeUyUMy6f-RmQ
06:31:14.954 INFO [discover] New peers discovered.
06:31:15.345 INFO [kafka-listener] Published 1 block(s)
06:31:23.284 INFO [net] TCP: Speed in/out 1779b / 2Kb(sec), packets in/out 81/93, total in/out: 17Kb / 23Kb
06:31:23.287 INFO [net] UDP: Speed in/out 3Kb / 2Kb(sec), packets in/out 140/192, total in/out: 30Kb / 26Kb
06:31:24.232 INFO [net] New peers processed: [c431ad9d | /66.63.190.186:30303], active peers added: 1, total active peers: 1
06:31:33.284 INFO [net] TCP: Speed in/out 14Kb / 3Kb(sec), packets in/out 164/128, total in/out: 166Kb / 55Kb
06:31:33.284 INFO [net] UDP: Speed in/out 1130b / 1095b(sec), packets in/out 45/78, total in/out: 41Kb / 36Kb
06:31:37.906 WARN [net] Can't decrypt AuthInitiateMessage from /127.0.0.1:54564. Most likely the remote peer used wrong public key (NodeID) to encrypt message.
06:31:43.284 INFO [net] TCP: Speed in/out 7Kb / 2Kb(sec), packets in/out 82/58, total in/out: 239Kb / 77Kb
06:31:43.284 INFO [net] UDP: Speed in/out 731b / 1090b(sec), packets in/out 34/68, total in/out: 48Kb / 47Kb
06:31:51.248 INFO [net] New peers processed: [58905431 | /128.199.83.11:30303], active peers added: 1, total active peers: 2
06:31:52.554 WARN [net] Can't decrypt AuthInitiateMessage from /127.0.0.1:54686. Most likely the remote peer used wrong public key (NodeID) to encrypt message.
06:31:53.284 INFO [net] TCP: Speed in/out 20Kb / 1579b(sec), packets in/out 117/65, total in/out: 448Kb / 93Kb
06:31:53.284 INFO [net] UDP: Speed in/out 2Kb / 2Kb(sec), packets in/out 91/162, total in/out: 72Kb / 69Kb
06:31:54.747 WARN [net] SnappyCodec failed:
io.netty.handler.codec.DecoderException: java.io.IOException: PARSING_ERROR(2)
at io.netty.handler.codec.MessageToMessageDecoder.channelRead(MessageToMessageDecoder.java:99)
at io.netty.handler.codec.MessageToMessageCodec.channelRead(MessageToMessageCodec.java:111)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:308)
at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:294)
at io.netty.handler.codec.ByteToMessageDecoder.channelRead(ByteToMessageDecoder.java:244)
at org.ethereum.net.rlpx.NettyByteToMessageCodec.channelRead(NettyByteToMessageCodec.java:73)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:308)
at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:294)
at io.netty.handler.codec.ByteToMessageDecoder.handlerRemoved(ByteToMessageDecoder.java:203)
at io.netty.channel.DefaultChannelPipeline.callHandlerRemoved0(DefaultChannelPipeline.java:527)
at io.netty.channel.DefaultChannelPipeline.callHandlerRemoved(DefaultChannelPipeline.java:521)
at io.netty.channel.DefaultChannelPipeline.remove0(DefaultChannelPipeline.java:351)
at io.netty.channel.DefaultChannelPipeline.remove(DefaultChannelPipeline.java:322)
at io.netty.channel.DefaultChannelPipeline.remove(DefaultChannelPipeline.java:299)
at org.ethereum.net.rlpx.HandshakeHandler.decode(HandshakeHandler.java:107)
at io.netty.handler.codec.ByteToMessageDecoder.callDecode(ByteToMessageDecoder.java:327)
at io.netty.handler.codec.ByteToMessageDecoder.channelRead(ByteToMessageDecoder.java:230)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:308)
at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:294)
at io.netty.channel.ChannelInboundHandlerAdapter.channelRead(ChannelInboundHandlerAdapter.java:86)
at org.ethereum.net.server.WireTrafficStats$TrafficStatHandler.channelRead(WireTrafficStats.java:103)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:308)
at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:294)
at io.netty.handler.timeout.ReadTimeoutHandler.channelRead(ReadTimeoutHandler.java:152)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:308)
at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:294)
at io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:846)
at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:131)
at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:511)
at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:468)
at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:382)
at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:354)
at io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:110)
at java.lang.Thread.run(Thread.java:748)
Caused by: java.io.IOException: PARSING_ERROR(2)
at org.xerial.snappy.SnappyNative.throw_error(SnappyNative.java:98)
at org.xerial.snappy.SnappyNative.uncompressedLength(Native Method)
at org.xerial.snappy.Snappy.uncompressedLength(Snappy.java:613)
at org.ethereum.net.rlpx.SnappyCodec.decode(SnappyCodec.java:80)
at org.ethereum.net.rlpx.SnappyCodec.decode(SnappyCodec.java:39)
at io.netty.handler.codec.MessageToMessageCodec$2.decode(MessageToMessageCodec.java:81)
at io.netty.handler.codec.MessageToMessageDecoder.channelRead(MessageToMessageDecoder.java:89)
... 33 common frames omitted
06:31:55.871 INFO [blockchain] *** Last block added [ #100 ]
```
|
process
|
exception while processing blocks i started a new empty processing every docker machine from empty state everything up and running initially i obtained the exception in kafka streams project upon inspection turned out that for some reason the ethereumj client received a bad block that can t decode properly so it dropped that processed block maybe the error is worse than that take a look and see if we can gather more insights appears to happen randomly maybe my network was not behaving as great this morning not sure if is easy to replicate in kafka streams error o a k s p i assignedstreamstasks stream thread failed to process stream task due to the following error java lang illegalstateexception block out of sequence expected received at io enkrypt kafka streams processors block chaineventstransformer ensuresequentialprocessing chaineventstransformer kt at io enkrypt kafka streams processors block chaineventstransformer transform chaineventstransformer kt at io enkrypt kafka streams processors block chaineventstransformer transform chaineventstransformer kt at org apache kafka streams kstream internals kstreamtransform kstreamtransformprocessor process kstreamtransform java at org apache kafka streams processor internals processornode process processornode java at org apache kafka streams processor internals processorcontextimpl forward processorcontextimpl java at org apache kafka streams processor internals processorcontextimpl forward processorcontextimpl java at org apache kafka streams processor internals processorcontextimpl forward processorcontextimpl java at org apache kafka streams kstream internals kstreampeek kstreampeekprocessor process kstreampeek java at org apache kafka streams processor internals processornode process processornode java at org apache kafka streams processor internals processorcontextimpl forward processorcontextimpl java at org apache kafka streams processor internals processorcontextimpl forward processorcontextimpl java at org apache kafka streams processor internals processorcontextimpl forward processorcontextimpl java at org apache kafka streams processor internals sourcenode process sourcenode java at org apache kafka streams processor internals streamtask process streamtask java at org apache kafka streams processor internals assignedstreamstasks process assignedstreamstasks java at org apache kafka streams processor internals taskmanager process taskmanager java at org apache kafka streams processor internals streamthread runonce streamthread java at org apache kafka streams processor internals streamthread runloop streamthread java at org apache kafka streams processor internals streamthread run streamthread java error o a k s p internals streamthread stream thread encountered the following error during processing java lang illegalstateexception block out of sequence expected received at io enkrypt kafka streams processors block chaineventstransformer ensuresequentialprocessing chaineventstransformer kt at io enkrypt kafka streams processors block chaineventstransformer transform chaineventstransformer kt at io enkrypt kafka streams processors block chaineventstransformer transform chaineventstransformer kt at org apache kafka streams kstream internals kstreamtransform kstreamtransformprocessor process kstreamtransform java at org apache kafka streams processor internals processornode process processornode java at org apache kafka streams processor internals processorcontextimpl forward processorcontextimpl java at org apache kafka streams 
processor internals processorcontextimpl forward processorcontextimpl java at org apache kafka streams processor internals processorcontextimpl forward processorcontextimpl java at org apache kafka streams kstream internals kstreampeek kstreampeekprocessor process kstreampeek java at org apache kafka streams processor internals processornode process processornode java at org apache kafka streams processor internals processorcontextimpl forward processorcontextimpl java at org apache kafka streams processor internals processorcontextimpl forward processorcontextimpl java at org apache kafka streams processor internals processorcontextimpl forward processorcontextimpl java at org apache kafka streams processor internals sourcenode process sourcenode java at org apache kafka streams processor internals streamtask process streamtask java at org apache kafka streams processor internals assignedstreamstasks process assignedstreamstasks java at org apache kafka streams processor internals taskmanager process taskmanager java at org apache kafka streams processor internals streamthread runonce streamthread java at org apache kafka streams processor internals streamthread runloop streamthread java at org apache kafka streams processor internals streamthread run streamthread java warn o apache kafka streams kafkastreams stream client all stream threads have died the instance will be in error state and should be closed exception in thread block processor afee streamthread java lang illegalstateexception block out of sequence expected received at io enkrypt kafka streams processors block chaineventstransformer ensuresequentialprocessing chaineventstransformer kt at io enkrypt kafka streams processors block chaineventstransformer transform chaineventstransformer kt at io enkrypt kafka streams processors block chaineventstransformer transform chaineventstransformer kt at org apache kafka streams kstream internals kstreamtransform kstreamtransformprocessor process kstreamtransform java at org apache kafka streams processor internals processornode process processornode java at org apache kafka streams processor internals processorcontextimpl forward processorcontextimpl java at org apache kafka streams processor internals processorcontextimpl forward processorcontextimpl java at org apache kafka streams processor internals processorcontextimpl forward processorcontextimpl java at org apache kafka streams kstream internals kstreampeek kstreampeekprocessor process kstreampeek java at org apache kafka streams processor internals processornode process processornode java at org apache kafka streams processor internals processorcontextimpl forward processorcontextimpl java at org apache kafka streams processor internals processorcontextimpl forward processorcontextimpl java at org apache kafka streams processor internals processorcontextimpl forward processorcontextimpl java at org apache kafka streams processor internals sourcenode process sourcenode java at org apache kafka streams processor internals streamtask process streamtask java at org apache kafka streams processor internals assignedstreamstasks process assignedstreamstasks java at org apache kafka streams processor internals taskmanager process taskmanager java at org apache kafka streams processor internals streamthread runonce streamthread java at org apache kafka streams processor internals streamthread runloop streamthread java at org apache kafka streams processor internals streamthread run streamthread java and in ethereumj info instantiated a 
transactional producer info overriding the default retries config to the recommended value of since the idempotent producer is enabled info overriding the default acks to all since idempotence is enabled info kafka version info kafka commitid info producerid set to with epoch info producerid set to with epoch info rocksdb key value data source created peers info external ip wasn t set using checkip amazonaws com to identify it info external address identified info ethereumj node started enode info db is empty adding genesis info genesis block loaded info bind address wasn t set punching to identify it info udp local bound to info discovery udplistener started info listening for incoming connections port info nodeid info reading node statistics from db nodes info pinging discovery nodes info received response warn edt task executed in more than sec executor queue size info kept caches cnt epochs info cluster id rmq info new peers discovered info published block s info tcp speed in out sec packets in out total in out info udp speed in out sec packets in out total in out info new peers processed active peers added total active peers info tcp speed in out sec packets in out total in out info udp speed in out sec packets in out total in out warn can t decrypt authinitiatemessage from most likely the remote peer used wrong public key nodeid to encrypt message info tcp speed in out sec packets in out total in out info udp speed in out sec packets in out total in out info new peers processed active peers added total active peers warn can t decrypt authinitiatemessage from most likely the remote peer used wrong public key nodeid to encrypt message info tcp speed in out sec packets in out total in out info udp speed in out sec packets in out total in out warn snappycodec failed io netty handler codec decoderexception java io ioexception parsing error at io netty handler codec messagetomessagedecoder channelread messagetomessagedecoder java at io netty handler codec messagetomessagecodec channelread messagetomessagecodec java at io netty channel abstractchannelhandlercontext invokechannelread abstractchannelhandlercontext java at io netty channel abstractchannelhandlercontext firechannelread abstractchannelhandlercontext java at io netty handler codec bytetomessagedecoder channelread bytetomessagedecoder java at org ethereum net rlpx nettybytetomessagecodec channelread nettybytetomessagecodec java at io netty channel abstractchannelhandlercontext invokechannelread abstractchannelhandlercontext java at io netty channel abstractchannelhandlercontext firechannelread abstractchannelhandlercontext java at io netty handler codec bytetomessagedecoder handlerremoved bytetomessagedecoder java at io netty channel defaultchannelpipeline defaultchannelpipeline java at io netty channel defaultchannelpipeline callhandlerremoved defaultchannelpipeline java at io netty channel defaultchannelpipeline defaultchannelpipeline java at io netty channel defaultchannelpipeline remove defaultchannelpipeline java at io netty channel defaultchannelpipeline remove defaultchannelpipeline java at org ethereum net rlpx handshakehandler decode handshakehandler java at io netty handler codec bytetomessagedecoder calldecode bytetomessagedecoder java at io netty handler codec bytetomessagedecoder channelread bytetomessagedecoder java at io netty channel abstractchannelhandlercontext invokechannelread abstractchannelhandlercontext java at io netty channel abstractchannelhandlercontext firechannelread abstractchannelhandlercontext java 
at io netty channel channelinboundhandleradapter channelread channelinboundhandleradapter java at org ethereum net server wiretrafficstats trafficstathandler channelread wiretrafficstats java at io netty channel abstractchannelhandlercontext invokechannelread abstractchannelhandlercontext java at io netty channel abstractchannelhandlercontext firechannelread abstractchannelhandlercontext java at io netty handler timeout readtimeouthandler channelread readtimeouthandler java at io netty channel abstractchannelhandlercontext invokechannelread abstractchannelhandlercontext java at io netty channel abstractchannelhandlercontext firechannelread abstractchannelhandlercontext java at io netty channel defaultchannelpipeline firechannelread defaultchannelpipeline java at io netty channel nio abstractniobytechannel niobyteunsafe read abstractniobytechannel java at io netty channel nio nioeventloop processselectedkey nioeventloop java at io netty channel nio nioeventloop processselectedkeysoptimized nioeventloop java at io netty channel nio nioeventloop processselectedkeys nioeventloop java at io netty channel nio nioeventloop run nioeventloop java at io netty util concurrent singlethreadeventexecutor run singlethreadeventexecutor java at java lang thread run thread java caused by java io ioexception parsing error at org xerial snappy snappynative throw error snappynative java at org xerial snappy snappynative uncompressedlength native method at org xerial snappy snappy uncompressedlength snappy java at org ethereum net rlpx snappycodec decode snappycodec java at org ethereum net rlpx snappycodec decode snappycodec java at io netty handler codec messagetomessagecodec decode messagetomessagecodec java at io netty handler codec messagetomessagedecoder channelread messagetomessagedecoder java common frames omitted info last block added
| 1
|
12,441
| 14,933,263,769
|
IssuesEvent
|
2021-01-25 08:59:16
|
GoogleCloudPlatform/fda-mystudies
|
https://api.github.com/repos/GoogleCloudPlatform/fda-mystudies
|
closed
|
Password criteria display should be consistent
|
Bug P2 Participant manager Process: Tested dev
|
Password criteria should be consistent
ER: 'Your password must be at least 8 characters long and contain lower case, upper case, numeric and special characters' should be displayed in the following screens:
1. Change Password screen
2. Setup password screen

|
1.0
|
Password criteria display should be consistent - Password criteria should be consistent
ER: 'Your password must be at least 8 characters long and contain lower case, upper case, numeric and special characters' should be displayed in the following screens:
1. Change Password screen
2. Setup password screen

|
process
|
password criteria display should be consistent password criteria should be consistent er your password must be at least characters long and contain lower case upper case numeric and special characters should be displayed in the following screens change password screen setup password screen
| 1
|
35,943
| 7,835,010,304
|
IssuesEvent
|
2018-06-16 21:36:04
|
bigbluebutton/bigbluebutton
|
https://api.github.com/repos/bigbluebutton/bigbluebutton
|
closed
|
passing invalid configToken on join throws grails 500
|
API Accepted Defect FirstProject Fixed Web
|

Current 1.0Beta: if you send a join request with a configToken that is invalid, you get a 500 error rather than a valid error response saying the config does not exist.
I tried this with /var/bigbluebutton/configs using configToken=test.xml, and also with a random made-up config name.
Steps.
1) create a meeting
http://test-install.blindsidenetworks.com/bigbluebutton/api/create?allowStartStopRecording=true&attendeePW=ap&autoStartRecording=false&meetingID=random-7705890&moderatorPW=mp&name=random-7705890&record=false&voiceBridge=78188&welcome=%3Cbr%3EWelcome+to+%3Cb%3E%25%25CONFNAME%25%25%3C%2Fb%3E%21&checksum=8cc8e32d2cbfc65bc7d065a75ea17ef366a3aace
2) Pass join with invalid config
http://test-install.blindsidenetworks.com/bigbluebutton/api/join?configToken=testme.xml&fullName=User+2191794&meetingID=random-7705890&password=mp&redirect=true&checksum=729ec0bfc323a6b7e85a31923ebd31dc2f3f32a6
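As a side note, the checksum in those URLs is the standard BigBlueButton API checksum: SHA-1 over the call name, the query string and the shared secret. A minimal sketch of building such a join URL (the shared secret value is a placeholder):
```python
import hashlib
from urllib.parse import urlencode

def bbb_url(base, call_name, params, shared_secret):
    # BigBlueButton API checksum: sha1(callName + queryString + sharedSecret)
    query = urlencode(params)
    checksum = hashlib.sha1((call_name + query + shared_secret).encode()).hexdigest()
    return f"{base}/api/{call_name}?{query}&checksum={checksum}"

print(bbb_url(
    "http://test-install.blindsidenetworks.com/bigbluebutton",
    "join",
    {"configToken": "testme.xml", "fullName": "User 2191794",
     "meetingID": "random-7705890", "password": "mp", "redirect": "true"},
    "SHARED_SECRET",  # placeholder, not the real secret
))
```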
|
1.0
|
passing invalid configToken on join throws grails 500 - 
Current 1.0Beta: if you send a join request with a configToken that is invalid, you get a 500 error rather than a valid error response saying the config does not exist.
I tried this with /var/bigbluebutton/configs using configToken=test.xml, and also with a random made-up config name.
Steps.
1) create a meeting
http://test-install.blindsidenetworks.com/bigbluebutton/api/create?allowStartStopRecording=true&attendeePW=ap&autoStartRecording=false&meetingID=random-7705890&moderatorPW=mp&name=random-7705890&record=false&voiceBridge=78188&welcome=%3Cbr%3EWelcome+to+%3Cb%3E%25%25CONFNAME%25%25%3C%2Fb%3E%21&checksum=8cc8e32d2cbfc65bc7d065a75ea17ef366a3aace
2) Pass join with invalid config
http://test-install.blindsidenetworks.com/bigbluebutton/api/join?configToken=testme.xml&fullName=User+2191794&meetingID=random-7705890&password=mp&redirect=true&checksum=729ec0bfc323a6b7e85a31923ebd31dc2f3f32a6
|
non_process
|
passing invalid configtoken on join throws grails current if you send a join with configtoken that is invalid you get error and not a valid response that does not exist i did this with var bigbluebutton configs configtoken test xml and also with a random made up config number steps create a meeting pass join with invalid config
| 0
|
101,192
| 21,627,554,331
|
IssuesEvent
|
2022-05-05 05:38:18
|
appsmithorg/appsmith
|
https://api.github.com/repos/appsmithorg/appsmith
|
closed
|
[Bug]-[1392]:Curl import corrupts data
|
Bug High Needs Triaging BE Coders Pod REST API plugin Actions Pod
|
### Is there an existing issue for this?
- [X] I have searched the existing issues
### Description
Curl import corrupted
### Steps To Reproduce
Try importing this curl command:
```
curl -X POST https://api.eu.sparkpost.com/api/v1/transmissions -H 'Authorization: <APIKEY>' -H 'Content-Type: application/json' -d '{
"options":{
"open_tracking":false,
"click_tracking":false,
"inline_css":false
},
"recipients":[
{
"address":{
"email":"user@domain.tld",
"name":"user"
}
}
],
"content":{
"from":{
"name":"sender",
"email":"sender@domain.tld"
},
"reply_to":"replyto@domain.tld",
"subject":"subject",
"text":"textbody",
"attachments":[
{
"name":"attachmentname.pdf",
"type":"application/pdf",
"data":"'$(cat test.pdf | base64 --wrap=0)'"
}
]
}
}'
```
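The same request can be reproduced without the shell substitution that the importer appears to choke on, by base64-encoding the attachment in code instead. A minimal sketch with the `requests` library (the API key is a placeholder):
```python
import base64

import requests  # third-party: pip install requests

API_KEY = "YOUR_API_KEY"  # placeholder

# Encode the attachment the same way `base64 --wrap=0` does.
with open("test.pdf", "rb") as f:
    pdf_b64 = base64.b64encode(f.read()).decode()

payload = {
    "options": {"open_tracking": False, "click_tracking": False, "inline_css": False},
    "recipients": [{"address": {"email": "user@domain.tld", "name": "user"}}],
    "content": {
        "from": {"name": "sender", "email": "sender@domain.tld"},
        "reply_to": "replyto@domain.tld",
        "subject": "subject",
        "text": "textbody",
        "attachments": [
            {"name": "attachmentname.pdf", "type": "application/pdf", "data": pdf_b64}
        ],
    },
}

resp = requests.post(
    "https://api.eu.sparkpost.com/api/v1/transmissions",
    headers={"Authorization": API_KEY},
    json=payload,
)
print(resp.status_code, resp.text)
```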
### Public Sample App
_No response_
### Version
self hosted
|
1.0
|
[Bug]-[1392]:Curl import corrupts data - ### Is there an existing issue for this?
- [X] I have searched the existing issues
### Description
Curl import corrupted
### Steps To Reproduce
Try importing this curl command:
```
curl -X POST https://api.eu.sparkpost.com/api/v1/transmissions -H 'Authorization: <APIKEY>' -H 'Content-Type: application/json' -d '{
"options":{
"open_tracking":false,
"click_tracking":false,
"inline_css":false
},
"recipients":[
{
"address":{
"email":"user@domain.tld",
"name":"user"
}
}
],
"content":{
"from":{
"name":"sender",
"email":"sender@domain.tld"
},
"reply_to":"replyto@domain.tld",
"subject":"subject",
"text":"textbody",
"attachments":[
{
"name":"attachmentname.pdf",
"type":"application/pdf",
"data":"'$(cat test.pdf | base64 --wrap=0)'"
}
]
}
}'
```
### Public Sample App
_No response_
### Version
self hosted
|
non_process
|
curl import corrupts data is there an existing issue for this i have searched the existing issues description curl import corrupted steps to reproduce try importing curl for curl x post h authorization h content type application json d options open tracking false click tracking false inline css false recipients address email user domain tld name user content from name sender email sender domain tld reply to replyto domain tld subject subject text textbody attachments name attachmentname pdf type application pdf data cat test pdf wrap public sample app no response version self hosted
| 0
|
20,188
| 10,650,945,673
|
IssuesEvent
|
2019-10-17 09:21:47
|
flutter/flutter
|
https://api.github.com/repos/flutter/flutter
|
closed
|
when setting [initialScrollOffset], ListView unnecessarily rendering
|
f: scrolling framework severe: performance waiting for customer response
|
## Steps to Reproduce
```dart
@override
Widget build(BuildContext context) {
double height = MediaQuery.of(context).size.height;
ScrollController scrollController = ScrollController(
initialScrollOffset: height*100,
keepScrollOffset: false);
ListView listView = ListView.builder(
controller: scrollController,
reverse: false,
itemCount: 99999,
scrollDirection: Axis.vertical,
itemBuilder: (BuildContext context, int index) {
debugPrint("building: $index");
return SizedBox.fromSize(
child: Text("$index"),
size: Size(MediaQuery.of(context).size.width, height),
);
},
);
return Scaffold(
floatingActionButton: RaisedButton(
onPressed: () {
// listView.
scrollController.jumpTo(
100 * height
);
},
),
body: listView,
);
}
```
Here's my code. I created a ListView with initialScrollOffset. It is supposed to render only the 100th item, but it rendered all 100 items.
|
True
|
when setting [initialScrollOffset], ListView unnecessarily rendering - ## Steps to Reproduce
```dart
@override
Widget build(BuildContext context) {
double height = MediaQuery.of(context).size.height;
ScrollController scrollController = ScrollController(
initialScrollOffset: height*100,
keepScrollOffset: false);
ListView listView = ListView.builder(
controller: scrollController,
reverse: false,
itemCount: 99999,
scrollDirection: Axis.vertical,
itemBuilder: (BuildContext context, int index) {
debugPrint("building: $index");
return SizedBox.fromSize(
child: Text("$index"),
size: Size(MediaQuery.of(context).size.width, height),
);
},
);
return Scaffold(
floatingActionButton: RaisedButton(
onPressed: () {
// listView.
scrollController.jumpTo(
100 * height
);
},
),
body: listView,
);
}
```
Here's my code. I created a ListView with initialScrollOffset. It is supposed to render only the 100th item, but it rendered all 100 items.
|
non_process
|
when setting listview unnecessarily rendering steps to reproduce dart override widget build buildcontext context double height mediaquery of context size height scrollcontroller scrollcontroller scrollcontroller initialscrolloffset height keepscrolloffset false listview listview listview builder controller scrollcontroller reverse false itemcount scrolldirection axis vertical itembuilder buildcontext context int index debugprint building index return sizedbox fromsize child text index size size mediaquery of context size width height return scaffold floatingactionbutton raisedbutton onpressed listview scrollcontroller jumpto height body listview here s my code i created a listview with initialscrolloffset it suppost to be only render the item but it rendered all the items
| 0
|
610,763
| 18,923,641,667
|
IssuesEvent
|
2021-11-17 06:45:55
|
yamamoto-yuta/slack-deck
|
https://api.github.com/repos/yamamoto-yuta/slack-deck
|
opened
|
✨ Make it possible to add a column from a `https://<WORKSPACE_NAME>.slack.com/archives/*` URL as well
|
enhancement Priority: HIGH
|
ToDo:
- Add a workspace registration field to the Config screen
- Add an option to specify the workspace when adding a column
|
1.0
|
✨ Make it possible to add a column from a `https://<WORKSPACE_NAME>.slack.com/archives/*` URL as well - ToDo:
- Add a workspace registration field to the Config screen
- Add an option to specify the workspace when adding a column
|
non_process
|
✨ make it possible to add a column from a workspace name slack com archives url as well todo add a workspace registration field to the config screen add an option to specify the workspace when adding a column
| 0
|
105,587
| 11,454,979,017
|
IssuesEvent
|
2020-02-06 18:09:02
|
MichelRahme/ConcordiaNavigation
|
https://api.github.com/repos/MichelRahme/ConcordiaNavigation
|
closed
|
8. Testing Plan
|
documentation
|
1. Unit testing
2. Integration testing
3. System testing
4. Acceptance testing
5. Testing and Results
5.1 Compare advantages/disadvantages of different testing frameworks
5.2 Include screenshots of successful installation/execution of testing
5.3 Include tools that will be used for computing test coverage
|
1.0
|
8. Testing Plan - 1. Unit testing
2. Integration testing
3. System testing
4. Acceptance testing
5. Testing and Results
5.1 Compare advantages/disadvantages of different testing frameworks
5.2 Include screenshots of successful installation/execution of testing
5.3 Include tools that will be used for computing test coverage
|
non_process
|
testing plan unit testing integration testing system testing acceptance testing testing and results compare advantages disadvantages of different testing frameworks include screenshots of successful installation execution of testing include tools that will be used for computing test coverage
| 0
|
3,640
| 6,676,375,102
|
IssuesEvent
|
2017-10-05 05:11:30
|
peterwebster/henson
|
https://api.github.com/repos/peterwebster/henson
|
reopened
|
Table-like formatting in journals
|
process refinement
|
In XML there is the possibility of rendering certain things as tables. Although inelegant, it may be that there are table-like things in the journals for which we should consider this. Could either @dpl0js (or Katie or Hilary as they come across them) let me know of examples that I may consider?
|
1.0
|
Table-like formatting in journals - In XML there is the possibility of rendering certain things as tables. Although inelegant, it may be that there are table-like things in the journals for which we should consider this. Could either @dpl0js (or Katie or Hilary as they come across them) let me know of examples that I may consider?
|
process
|
table like formatting in journals in xml there is the possibility of rendering certain things as tables although inelegant it may be that there are table like things in the journals for which we should consider this could either or katie or hilary as they come across them let me know of examples that i may consider
| 1
|
14,767
| 18,045,776,495
|
IssuesEvent
|
2021-09-18 21:46:47
|
Leviatan-Analytics/LA-data-processing
|
https://api.github.com/repos/Leviatan-Analytics/LA-data-processing
|
closed
|
Implement last analyses list endpoint [3]
|
Data Processing Week 3 Sprint 4
|
Implement a backend endpoint to get a list of all previous analyses.
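A minimal sketch of what such an endpoint could look like, assuming a Flask-style backend; the route name and the in-memory store are hypothetical, and the real implementation would query the project's database:
```python
from flask import Flask, jsonify

app = Flask(__name__)
ANALYSES = []  # hypothetical in-memory store standing in for the database

@app.route("/analyses", methods=["GET"])
def list_analyses():
    # Return every previous analysis, newest first.
    return jsonify(sorted(ANALYSES, key=lambda a: a.get("created_at", ""), reverse=True))
```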
|
1.0
|
Implement last analyses list endpoint [3] - Implement a backend endpoint to get a list of all previous analyses.
|
process
|
implement last analyses list endpoint implement backend endpoint to get a list of all previous analyses
| 1
|
12,291
| 14,850,519,093
|
IssuesEvent
|
2021-01-18 04:44:09
|
yuta252/startlens_web_backend
|
https://api.github.com/repos/yuta252/startlens_web_backend
|
closed
|
Implement the user favorites list screen
|
dev process
|
## Overview
Create an API endpoint that returns each user's list of sightseeing spots, so that users can view the spots they have favorited.
## Changes
---
- [x] Add an index action to favoritesController that returns each user's favorites list in the response
- [x] Fix the authenticable concern. Fixed a bug where the JWT could not be decoded because users who are not logged in cannot set a JWT token in the Authorization header
- [x] In the user model, add a search feature so the favorites list can be filtered by user ID (tourist_id)
- [x] Add Rspec tests
|
1.0
|
Implement the user favorites list screen - ## Overview
Create an API endpoint that returns each user's list of sightseeing spots, so that users can view the spots they have favorited.
## Changes
---
- [x] Add an index action to favoritesController that returns each user's favorites list in the response
- [x] Fix the authenticable concern. Fixed a bug where the JWT could not be decoded because users who are not logged in cannot set a JWT token in the Authorization header
- [x] In the user model, add a search feature so the favorites list can be filtered by user ID (tourist_id)
- [x] Add Rspec tests
|
process
|
implement the user favorites list screen overview create an api endpoint that returns each user s list of sightseeing spots so that users can view the spots they have favorited changes add an index action to favoritescontroller that returns each user s favorites list in the response fix the authenticable concern fixed a bug where the jwt could not be decoded because users who are not logged in cannot set a jwt token in the authorization header in the user model add a search feature so the favorites list can be filtered by user id tourist id add rspec tests
| 1
|
2,203
| 5,047,262,316
|
IssuesEvent
|
2016-12-20 08:47:11
|
hbz/lobid-resources
|
https://api.github.com/repos/hbz/lobid-resources
|
reopened
|
Don't use relators as property but indicate relator via "role" property
|
API 2.0 processing
|
As discussed in #10 , we will indicate the roles of a contributor in the contributor object with the "role" property and won't use relators as properties.
Example:
``` json
{
"@context": {
"id": "@id",
"type": "@type",
"role": "http://bibframe.org/vocab/relator",
"contributor": {
"@type": "@id",
"@id": "http://purl.org/dc/terms/contributor",
"@container": "@set"
},
"label": "http://www.w3.org/2000/01/rdf-schema#label",
"altLabel": "http://www.w3.org/2004/02/skos/core#altLabel",
"subject": {
"@type": "@id",
"@id": "http://purl.org/dc/terms/subject",
"@container": "@list"
}
},
"@id" : "http://lobid.org/resources/HT018843259#!",
"contributor": [ {
"id": "http://d-nb.info/gnd/118548018",
"type": "DifferentiatedPerson",
"role": {
"id": "http://id.loc.gov/vocabulary/relators/cre",
"label": "Autor/in"
},
"label": "Becker, Thomas Paul",
"altLabel": [ "Becker, Thomas P." ]
} ],
"subject": [ {
"id" : "http://d-nb.info/gnd/4031485-6",
"type": "PlaceOrGeographicName",
"label": "Erzstift Köln",
"altLabel": [ "Kölner Krieg", "Truchsessischer Krieg" ]
} ]
}
```
We won't be able to only use a string as value for "role" as proposed in #10 but will – as you can see in the example – also use the [side car approach](https://github.com/hbz/lobid-rdf-to-json/issues/30) here – giving us presentation labels for the different roles in the data.
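For consumers, the practical effect is that the relator is read from inside the contributor object rather than from the property name. A minimal sketch of reading the example above, using plain JSON traversal rather than a JSON-LD processor:
```python
# The contributor object from the example above, reduced to the relevant keys.
doc = {
    "contributor": [{
        "label": "Becker, Thomas Paul",
        "role": {
            "id": "http://id.loc.gov/vocabulary/relators/cre",
            "label": "Autor/in",
        },
    }]
}

for c in doc["contributor"]:
    # The relator URI and its presentation label both come from the side car.
    print(c["label"], "->", c["role"]["id"], c["role"]["label"])
```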
|
1.0
|
Don't use relators as property but indicate relator via "role" property - As discussed in #10 , we will indicate the roles of a contributor in the contributor object with the "role" property and won't use relators as properties.
Example:
``` json
{
"@context": {
"id": "@id",
"type": "@type",
"role": "http://bibframe.org/vocab/relator",
"contributor": {
"@type": "@id",
"@id": "http://purl.org/dc/terms/contributor",
"@container": "@set"
},
"label": "http://www.w3.org/2000/01/rdf-schema#label",
"altLabel": "http://www.w3.org/2004/02/skos/core#altLabel",
"subject": {
"@type": "@id",
"@id": "http://purl.org/dc/terms/subject",
"@container": "@list"
}
},
"@id" : "http://lobid.org/resources/HT018843259#!",
"contributor": [ {
"id": "http://d-nb.info/gnd/118548018",
"type": "DifferentiatedPerson",
"role": {
"id": "http://id.loc.gov/vocabulary/relators/cre",
"label": "Autor/in"
},
"label": "Becker, Thomas Paul",
"altLabel": [ "Becker, Thomas P." ]
} ],
"subject": [ {
"id" : "http://d-nb.info/gnd/4031485-6",
"type": "PlaceOrGeographicName",
"label": "Erzstift Köln",
"altLabel": [ "Kölner Krieg", "Truchsessischer Krieg" ]
} ]
}
```
We won't be able to only use a string as value for "role" as proposed in #10 but will – as you can see in the example – also use the [side car approach](https://github.com/hbz/lobid-rdf-to-json/issues/30) here – giving us presentation labels for the different roles in the data.
|
process
|
don t use relators as property but indicate relator via role property as discussed in we will indicate the roles of a contributor in the contributor object with the role property and won t use relators as properties example json context id id type type role contributor type id id container set label altlabel subject type id id container list id contributor id type differentiatedperson role id label autor in label becker thomas paul altlabel subject id type placeorgeographicname label erzstift köln altlabel we won t be able to only use a string as value for role as proposed in but will – as you can see in the example – also use the here – giving us presentation labels for the different roles in the data
| 1
|
237,648
| 7,762,706,422
|
IssuesEvent
|
2018-06-01 14:20:34
|
codenameone/CodenameOne
|
https://api.github.com/repos/codenameone/CodenameOne
|
closed
|
[BUG] callback of Capture.capturePhoto(callback) on Android is not invoked the first time
|
Priority-High
|
The following bug can be reproduced on **Android** 7, and only for the first photo captured after the app has just been installed. To reproduce the bug again, it's necessary to remove the app and install it again **(if you reinstall the app without removing it, or if you kill and restart the app, the bug cannot be reproduced)**. After the first photo is taken, the bug disappears (until you remove and install the app again).
This bug cannot be reproduced in the Simulator.
I report code and logs below. The same issue can be reproduced with the code published here (_on Android, the first image taken after the app install is not placed in the Label_): https://www.codenameone.com/blog/round-at-codemotion.html
Code:
```
// Avatar default picture
int avatarSizeMM = 10; // the avatar size in mm
Style s = new Style();
s.setFgColor(ColorUtil.LTGRAY);
Label avatar = new Label(createAvatar(FontImage.createMaterial(FontImage.MATERIAL_PERSON, s, avatarSizeMM)), "Avatar");
// Callback after taking a photo (from photocamera or file chooser)
ActionListener callback = e -> {
Log.p("ActionListener callback invoked");
if (e != null && e.getSource() != null) {
String filePath = (String) e.getSource();
Image capturedImage = null;
Log.p("Capured image filePath: " + filePath);
if (filePath != null) {
try {
FileSystemStorage fs = FileSystemStorage.getInstance();
InputStream fis = fs.openInputStream(filePath);
capturedImage = Image.createImage(fis);
} catch (IOException ex) {
Log.p("IOException in loading the image: " + filePath);
Log.e(ex);
}
if (capturedImage != null) {
createAvatar(capturedImage, avatarSizeMM, avatar);
}
} else {
Log.p("Error: \"filePath = (String) e.getSource()\" is null");
}
}
Log.p("The image capture was canceled by the user (e.getSource() is null)");
};
// Button to take a photo from photocamera
Button camera = new Button("Fotocamera", FontImage.createMaterial(FontImage.MATERIAL_CAMERA, "Avatar", 5));
camera.addActionListener((e) -> {
Log.p("The native photocamera is going to be opened");
Capture.capturePhoto(callback);
});
```
After the first photo is taken, the callback is not invoked. The log is only:
`[EDT] 0:0:39,881 - The native photocamera is going to be opened`
After the second photo is taken, the log is _(note that the last line of the log doesn't make sense; I didn't cancel the image capture)_:
```
[EDT] 0:0:39,881 - The native photocamera is going to be opened
[EDT] 0:7:54,190 - The native photocamera is going to be opened
[EDT] 0:7:59,700 - ActionListener callback invoked
[EDT] 0:7:59,700 - Capured image filePath: /storage/emulated/0/Pictures/Registrazione/IMG_20180509_204014.jpg
[EDT] 0:8:0,45 - The image capture was cancelled by the user (e.getSource() is null)
```
After the third photo is taken, the log is:
```
[EDT] 0:0:39,881 - The native photocamera is going to be opened
[EDT] 0:7:54,190 - The native photocamera is going to be opened
[EDT] 0:7:59,700 - ActionListener callback invoked
[EDT] 0:7:59,700 - Capured image filePath: /storage/emulated/0/Pictures/Registrazione/IMG_20180509_204014.jpg
[EDT] 0:8:0,45 - The image capture was canceled by the user (e.getSource() is null)
[EDT] 0:15:2,647 - The native photocamera is going to be opened
[EDT] 0:15:13,567 - ActionListener callback invoked
[EDT] 0:15:13,568 - Capured image filePath: /storage/emulated/0/Pictures/Registrazione/IMG_20180509_204723.jpg
[EDT] 0:15:13,840 - The image capture was canceled by the user (e.getSource() is null)
```
|
1.0
|
[BUG] callback of Capture.capturePhoto(callback) on Android is not invoked the first time - The following bug can be reproduced on **Android** 7, and only for the first photo captured after the app has just been installed. To reproduce the bug again, it's necessary to remove the app and install it again **(if you reinstall the app without removing it, or if you kill and restart the app, the bug cannot be reproduced)**. After the first photo is taken, the bug disappears (until you remove and install the app again).
This bug cannot be reproduced in the Simulator.
I report code and logs below. The same issue can be reproduced with the code published here (_on Android, the first image taken after the app install is not placed in the Label_): https://www.codenameone.com/blog/round-at-codemotion.html
Code:
```
// Avatar default picture
int avatarSizeMM = 10; // the avatar size in mm
Style s = new Style();
s.setFgColor(ColorUtil.LTGRAY);
Label avatar = new Label(createAvatar(FontImage.createMaterial(FontImage.MATERIAL_PERSON, s, avatarSizeMM)), "Avatar");
// Callback after taking a photo (from photocamera or file chooser)
ActionListener callback = e -> {
Log.p("ActionListener callback invoked");
if (e != null && e.getSource() != null) {
String filePath = (String) e.getSource();
Image capturedImage = null;
Log.p("Capured image filePath: " + filePath);
if (filePath != null) {
try {
FileSystemStorage fs = FileSystemStorage.getInstance();
InputStream fis = fs.openInputStream(filePath);
capturedImage = Image.createImage(fis);
} catch (IOException ex) {
Log.p("IOException in loading the image: " + filePath);
Log.e(ex);
}
if (capturedImage != null) {
createAvatar(capturedImage, avatarSizeMM, avatar);
}
} else {
Log.p("Error: \"filePath = (String) e.getSource()\" is null");
}
}
Log.p("The image capture was canceled by the user (e.getSource() is null)");
};
// Button to take a photo from photocamera
Button camera = new Button("Fotocamera", FontImage.createMaterial(FontImage.MATERIAL_CAMERA, "Avatar", 5));
camera.addActionListener((e) -> {
Log.p("The native photocamera is going to be opened");
Capture.capturePhoto(callback);
});
```
After the first photo is taken, the callback is not invoked. The log is only:
`[EDT] 0:0:39,881 - The native photocamera is going to be opened`
After the second photo is taken, the log is _(note that the last line of the log doesn't make sense; I didn't cancel the image capture)_:
```
[EDT] 0:0:39,881 - The native photocamera is going to be opened
[EDT] 0:7:54,190 - The native photocamera is going to be opened
[EDT] 0:7:59,700 - ActionListener callback invoked
[EDT] 0:7:59,700 - Capured image filePath: /storage/emulated/0/Pictures/Registrazione/IMG_20180509_204014.jpg
[EDT] 0:8:0,45 - The image capture was cancelled by the user (e.getSource() is null)
```
After the third photo is taken, the log is:
```
[EDT] 0:0:39,881 - The native photocamera is going to be opened
[EDT] 0:7:54,190 - The native photocamera is going to be opened
[EDT] 0:7:59,700 - ActionListener callback invoked
[EDT] 0:7:59,700 - Capured image filePath: /storage/emulated/0/Pictures/Registrazione/IMG_20180509_204014.jpg
[EDT] 0:8:0,45 - The image capture was canceled by the user (e.getSource() is null)
[EDT] 0:15:2,647 - The native photocamera is going to be opened
[EDT] 0:15:13,567 - ActionListener callback invoked
[EDT] 0:15:13,568 - Capured image filePath: /storage/emulated/0/Pictures/Registrazione/IMG_20180509_204723.jpg
[EDT] 0:15:13,840 - The image capture was canceled by the user (e.getSource() is null)
```
|
non_process
|
callback of capture capturephoto callback on android is not invoked the first time the following bug can be reproduced on android only after the first capturing of a photo after that the app is just installed to reproduce the bug again it s necessary to remove the app and install it again if you reinstall the app without removing it or if you kill and restart the app the bug cannot be reproduced after the first taken photo the bug disappears until you will remove and install the app again this bug cannot be reproduced in the simulator i report code and logs however the same issue can be reproduced with the code published here on android the first taken image after the app install is not placed in the label code avatar default picture int avatarsizemm the avatar size in mm style s new style s setfgcolor colorutil ltgray label avatar new label createavatar fontimage creatematerial fontimage material person s avatarsizemm avatar callback after taking a photo from photocamera or file chooser actionlistener callback e log p actionlistener callback invoked if e null e getsource null string filepath string e getsource image capturedimage null log p capured image filepath filepath if filepath null try filesystemstorage fs filesystemstorage getinstance inputstream fis fs openinputstream filepath capturedimage image createimage fis catch ioexception ex log p ioexception in loading the image filepath log e ex if capturedimage null createavatar capturedimage avatarsizemm avatar else log p error filepath string e getsource is null log p the image capture was canceled by the user e getsource is null button to take a photo from photocamera button camera new button fotocamera fontimage creatematerial fontimage material camera avatar camera addactionlistener e log p the native photocamera is going to be opened capture capturephoto callback after the first taken photo the callback is not invoked the log is only the native photocamera is going to be opened after the second taken photo the log is note that the last line of log doesn t make sense i didn t cancelled the image capture the native photocamera is going to be opened the native photocamera is going to be opened actionlistener callback invoked capured image filepath storage emulated pictures registrazione img jpg the image capture was cancelled by the user e getsource is null after the third taken photo the log is the native photocamera is going to be opened the native photocamera is going to be opened actionlistener callback invoked capured image filepath storage emulated pictures registrazione img jpg the image capture was canceled by the user e getsource is null the native photocamera is going to be opened actionlistener callback invoked capured image filepath storage emulated pictures registrazione img jpg the image capture was canceled by the user e getsource is null
| 0
|
122,789
| 12,159,936,751
|
IssuesEvent
|
2020-04-26 11:15:12
|
PhuongPhg/SAVABLE
|
https://api.github.com/repos/PhuongPhg/SAVABLE
|
closed
|
Testing back-end problem
|
documentation help wanted
|
I don't know how to test them, because we used the service and "it" does it (tests it every time it runs?)
Or should we use a local database instead?
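One common way around this is to not hit the real service in tests at all and use an in-memory stand-in instead. A minimal sketch; the class and method names here are hypothetical:
```python
class FakeDatabase:
    """In-memory stand-in for the remote service, used only in tests."""

    def __init__(self):
        self.rows = []

    def insert(self, row):
        self.rows.append(row)

    def all(self):
        return list(self.rows)

def test_insert_and_read():
    # The test exercises the same interface the service-backed code uses.
    db = FakeDatabase()
    db.insert({"id": 1})
    assert db.all() == [{"id": 1}]
```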
|
1.0
|
Testing back-end problem - I don't know how to test them, because we used the service and "it" does it (tests it every time it runs?)
Or should we use a local database instead?
|
non_process
|
testing back end problem i don t know how to test them because we used the service and it do it test it everytime it run or we do a local database instead
| 0
|
19,881
| 26,323,144,279
|
IssuesEvent
|
2023-01-10 02:39:05
|
processing/processing4
|
https://api.github.com/repos/processing/processing4
|
closed
|
cannot use @Override and @Deprecated in static mode
|
preprocessor
|
I just updated Processing to version 4.1.1, and now when I run a sketch that has
`@Override` or `@Deprecated` inside a `class`, Processing doesn't run it. Is that a regression, or is there a goal behind it?
Before, this worked fine with Processing 4.0b7 and every version before that, back to 1.5.1.
The console returns:
```
Cannot find a class or type named @Overridepublic...
```
or
```
Cannot find a class or type named @Deprecatedpublic...
```
And always bravo for the Processing Team !!!!
|
1.0
|
cannot use @Override and @Deprecated in static mode - I just updated Processing to version 4.1.1, and now when I run a sketch that has
`@Override` or `@Deprecated` inside a `class`, Processing doesn't run it. Is that a regression, or is there a goal behind it?
Before, this worked fine with Processing 4.0b7 and every version before that, back to 1.5.1.
The console returns:
```
Cannot find a class or type named @Overridepublic...
```
or
```
Cannot find a class or type named @Deprecatedpublic...
```
And always bravo for the Processing Team !!!!
|
process
|
cannot use override and deprecated in static mode i just updated processing for version and now when i run sketch when there is override deprecated inside class processing don t run that s a regression or there is a goal behind that before that s work fine withe processing and all the version before that from the console return cannot find a class or type named overridepublic or cannot find a class or type named deprecatedpublic and always bravo for the processing team
| 1
|
176,379
| 14,580,046,123
|
IssuesEvent
|
2020-12-18 08:29:56
|
yolkbaron/Code-Names
|
https://api.github.com/repos/yolkbaron/Code-Names
|
closed
|
Starting screen
|
documentation invalid
|
What are these random numbers in starting_screen.py? They should be taken from the constants.py module.
|
1.0
|
Starting screen - What are these random numbers in starting_screen.py? They should be taken from the constants.py module.
|
non_process
|
starting screen what are these random numbers in starting screen py they should be taken from constants py module
| 0
|
700,995
| 24,081,586,506
|
IssuesEvent
|
2022-09-19 07:10:14
|
jbx-protocol/juice-interface
|
https://api.github.com/repos/jbx-protocol/juice-interface
|
opened
|
Add CSV upload to all payouts
|
type:enhancement V2 V1 priority:1 ux:project owner
|
Add CSV Upload to:
- V2 FC reconfig
- V1 Edit payouts
- V1 FC reconfig
|
1.0
|
Add CSV upload to all payouts - Add CSV Upload to:
- V2 FC reconfig
- V1 Edit payouts
- V1 FC reconfig
|
non_process
|
add csv upload to all payouts add csv upload to fc reconfig edit payouts fc reconfig
| 0
|
258,962
| 22,361,351,816
|
IssuesEvent
|
2022-06-15 20:51:58
|
ossf/scorecard-action
|
https://api.github.com/repos/ossf/scorecard-action
|
closed
|
Failing e2e tests - scorecard-golang on ossf-tests/scorecard-action
|
e2e automated-tests
|
Matrix: {
"results_format": "json",
"publish_results": true,
"upload_result": false
}
Repo: https://github.com/ossf-tests/scorecard-action/tree/main
Run: https://github.com/ossf-tests/scorecard-action/actions/runs/2505038499
Workflow name: scorecard-golang
Workflow file: https://github.com/ossf-tests/scorecard-action/tree/main/.github/workflows/scorecards-golang.yml
Trigger: push
Branch: main
|
1.0
|
Failing e2e tests - scorecard-golang on ossf-tests/scorecard-action - Matrix: {
"results_format": "json",
"publish_results": true,
"upload_result": false
}
Repo: https://github.com/ossf-tests/scorecard-action/tree/main
Run: https://github.com/ossf-tests/scorecard-action/actions/runs/2505038499
Workflow name: scorecard-golang
Workflow file: https://github.com/ossf-tests/scorecard-action/tree/main/.github/workflows/scorecards-golang.yml
Trigger: push
Branch: main
|
non_process
|
failing tests scorecard golang on ossf tests scorecard action matrix results format json publish results true upload result false repo run workflow name scorecard golang workflow file trigger push branch main
| 0
|
3,749
| 6,733,150,305
|
IssuesEvent
|
2017-10-18 14:00:02
|
york-region-tpss/stp
|
https://api.github.com/repos/york-region-tpss/stp
|
closed
|
Watering Payment Workflow - Paid One Whole Assignment At A Time
|
enhancement process workflow
|
In the old system, watering assignments are paid by items.
Our clients request to pay by assignment, so that an assignment will be paid when any item in that assignment is paid.
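In other words, the payment status of an assignment becomes a function of its items. A minimal sketch of the requested rule; the field name is an assumption:
```python
def assignment_paid(items):
    # Pay-by-assignment: the whole assignment counts as paid as soon
    # as any one of its items is paid.
    return any(item["paid"] for item in items)

print(assignment_paid([{"paid": False}, {"paid": True}]))   # True
print(assignment_paid([{"paid": False}, {"paid": False}]))  # False
```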
|
1.0
|
Watering Payment Workflow - Paid One Whole Assignment At A Time - In the old system, watering assignments are paid by items.
Our clients request to pay by assignment, so that an assignment will be paid when any item in that assignment is paid.
|
process
|
watering payment workflow paid one whole assignment at a time in the old system watering assignments are paid by items our clients request to pay by assignment so that an assignment will be paid when any item in that assignment is paid
| 1
|
704,652
| 24,204,456,884
|
IssuesEvent
|
2022-09-25 02:30:31
|
all-contributors/app
|
https://api.github.com/repos/all-contributors/app
|
closed
|
contributorsSortAlphabetically doesn't sort alphabetically using bot
|
bug priority: high status: waiting for feedback pinned
|
**Describe the bug**
I've added `"contributorsSortAlphabetically": true,` to my repo's [`.allcontributorsrc.json` file](https://github.com/alan-turing-institute/the-turing-way/blob/master/.all-contributorsrc#L9). After I added a new contributor the table stayed in the same order and added the new contributor to the end of the table 😕
Here's a link to the table: https://github.com/alan-turing-institute/the-turing-way#contributors
**To Reproduce**
Steps to reproduce the behavior:
1. Create table of contributors - use the bot for a little while
2. Add `"contributorsSortAlphabetically": true,` to the .allcontributorsrc.json file
3. Ask the All Contributors Bot 🤖 to add a new contributor
4. Receive a pull request from the bot which adds the contributor to the json file and the rendered table, but which does not re-order the table or file.
**Expected behavior**
I was expecting the table in the README file to be sorted in alphabetical order - although I wasn't sure if it would be ordered by username, first name or last name...
**Additional context**
This feature was added in https://github.com/all-contributors/all-contributors-cli/pull/249 by @alexwlchan.
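As a workaround until the bot honours the option, the file can be re-sorted once by hand. A minimal sketch, assuming the standard `.all-contributorsrc` layout with a `contributors` array of objects that carry a `login` key:
```python
import json

with open(".all-contributorsrc") as f:
    rc = json.load(f)

# Sort case-insensitively by GitHub login; change the key if the table
# should be ordered by display name instead.
rc["contributors"].sort(key=lambda c: c["login"].lower())

with open(".all-contributorsrc", "w") as f:
    json.dump(rc, f, indent=2)
```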
|
1.0
|
contributorsSortAlphabetically doesn't sort alphabetically using bot - **Describe the bug**
I've added `"contributorsSortAlphabetically": true,` to my repo's [`.allcontributorsrc.json` file](https://github.com/alan-turing-institute/the-turing-way/blob/master/.all-contributorsrc#L9). After I added a new contributor the table stayed in the same order and added the new contributor to the end of the table 😕
Here's a link to the table: https://github.com/alan-turing-institute/the-turing-way#contributors
**To Reproduce**
Steps to reproduce the behavior:
1. Create table of contributors - use the bot for a little while
2. Add `"contributorsSortAlphabetically": true,` to the .allcontributorsrc.json file
3. Ask the All Contributors Bot 🤖 to add a new contributor
4. Receive a pull request from the bot which adds the contributor to the json file and the rendered table, but which does not re-order the table or file.
**Expected behavior**
I was expecting the table in the README file to be sorted in alphabetical order - although I wasn't sure if it would be ordered by username, first name or last name...
**Additional context**
This feature was added in https://github.com/all-contributors/all-contributors-cli/pull/249 by @alexwlchan.
|
non_process
|
contributorssortalphabetically doesn t sort alphabetically using bot describe the bug i ve added contributorssortalphabetically true to my repo s after i added a new contributor the table stayed in the same order and added the new contributor to the end of the table 😕 here s a link to the table to reproduce steps to reproduce the behavior create table of contributors use the bot for a little while add contributorssortalphabetically true to the allcontributorsrc json file ask the all contributors bot 🤖 to add a new contributor receive a pull request from the bot which adds the contributor to the json file and the rendered table but which does not re order the table or file expected behavior i was expecting the table in the readme file to be sorted in alphabetical order although i wasn t sure if it would be ordered by username first name or last name additional context this feature was added in by alexwlchan
| 0
|
1,023
| 3,481,298,874
|
IssuesEvent
|
2015-12-29 15:13:53
|
kahanu/System.Linq.Dynamic
|
https://api.github.com/repos/kahanu/System.Linq.Dynamic
|
closed
|
Release Cycle/Nuget Release
|
process
|
@kahanu Issues #29 and #31 have been resolved and merged. What is the release cycle for nuget packages? How often do you typically push a new nuget package?
|
1.0
|
Release Cycle/Nuget Release - @kahanu Issues #29 and #31 have been resolved and merged. What is the release cycle for nuget packages? How often do you typically push a new nuget package?
|
process
|
release cycle nuget release kahanu issues and have been resolved and merged what is the release cycle for nuget packages how often do you typically push a new nuget package
| 1
|
10,182
| 13,044,162,854
|
IssuesEvent
|
2020-07-29 03:47:37
|
tikv/tikv
|
https://api.github.com/repos/tikv/tikv
|
closed
|
UCP: Migrate scalar function `LeastInt` from TiDB
|
challenge-program-2 component/coprocessor difficulty/easy sig/coprocessor
|
## Description
Port the scalar function `LeastInt` from TiDB to coprocessor.
## Score
* 50
## Mentor(s)
* @sticnarf
## Recommended Skills
* Rust programming
## Learning Materials
Already implemented expressions ported from TiDB
- https://github.com/tikv/tikv/tree/master/components/tidb_query/src/rpn_expr
- https://github.com/tikv/tikv/tree/master/components/tidb_query/src/expr
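For reference, the expected semantics follow MySQL's `LEAST` builtin: the result is the minimum of the arguments, and `NULL` as soon as any argument is `NULL`. A minimal sketch of that behaviour, in Python rather than in the Rust coprocessor code:
```python
def least_int(args):
    # MySQL LEAST semantics: NULL if any argument is NULL,
    # otherwise the minimum of the arguments.
    if any(a is None for a in args):
        return None
    return min(args)

print(least_int([3, 1, 2]))     # 1
print(least_int([3, None, 2]))  # None
```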
|
2.0
|
UCP: Migrate scalar function `LeastInt` from TiDB -
## Description
Port the scalar function `LeastInt` from TiDB to coprocessor.
## Score
* 50
## Mentor(s)
* @sticnarf
## Recommended Skills
* Rust programming
## Learning Materials
Already implemented expressions ported from TiDB
- https://github.com/tikv/tikv/tree/master/components/tidb_query/src/rpn_expr
- https://github.com/tikv/tikv/tree/master/components/tidb_query/src/expr
|
process
|
ucp migrate scalar function leastint from tidb description port the scalar function leastint from tidb to coprocessor score mentor s sticnarf recommended skills rust programming learning materials already implemented expressions ported from tidb
| 1
|
160,176
| 25,116,503,757
|
IssuesEvent
|
2022-11-09 02:47:52
|
DeveloperAcademy-YOLO/ProjectYOLO
|
https://api.github.com/repos/DeveloperAcademy-YOLO/ProjectYOLO
|
closed
|
[Design] Change Sticker Button
|
design
|
# Please write a story
- Please write tasks
- Change the color of the sticker button.
## 🎨 Design Screenshot



- Screenshot
## 🤔 Completion Conditions
- [ ] Replace the existing button with the new button
|
1.0
|
[Design] Change Sticker Button - # Please write a story
- Please write tasks
- Change the color of the sticker button.
## 🎨 Design Screenshot



- Screenshot
## 🤔 Completion Conditions
- [ ] Replace the existing button with the new button
|
non_process
|
change sticker button please write a story please write tasks change the color of the sticker button 🎨 design screenshot screenshot 🤔 completion conditions replace the existing button with the new button
| 0
|
3,050
| 6,042,164,759
|
IssuesEvent
|
2017-06-11 10:13:57
|
AllenFang/react-bootstrap-table
|
https://api.github.com/repos/AllenFang/react-bootstrap-table
|
closed
|
Keyboard navigation over expanding rows
|
help wanted inprocess
|
Hello,
thank you for this great script. It really makes table handling easy. 👍
Currently I work on a table with subtables inside expanding rows.
It works great, but when I enable keyboard navigation, I can only navigate through the root table. For the tables in the expanded row, I have to click inside a cell again.
Would it be possible to handle the navigation from row to row?
Also it would be great to expand/hide rows with the keyboard.
|
1.0
|
Keyboard navigation over expanding rows - Hello,
thank you for this great script. It really makes table handling easy. 👍
Currently I work on a table with subtables inside expanding rows.
It works great, but when I enable keyboard navigation, I can only navigate through the root table. For the tables in the expanded row, I have to click inside a cell again.
Would it be possible to handle the navigation from row to row?
Also it would be great to expand/hide rows with the keyboard.
|
process
|
keyboard navigation over expanding rows hello thank you for this create script it really makes table handling easy 👍 currently i work on a table with subtables inside expanding rows it works great but when i enable keyboard navigation i could only navigate through the root table for the tables in the expanded row i have to click again inside a cell would it be possible to handle the navigation from row to row also it would be great to expand hide rows with the keyboard
| 1
|
11,517
| 14,399,724,850
|
IssuesEvent
|
2020-12-03 11:20:04
|
syncfusion/ej2-react-ui-components
|
https://api.github.com/repos/syncfusion/ej2-react-ui-components
|
closed
|
Red Canvas on Google Chrome
|
need-more-information word-processor
|
One of the canvas elements has a red background on load when using the latest stable Google Chrome version (79.0.3945.88). You can view the issue from the online demo (https://ej2.syncfusion.com/react/demos/#/material/document-editor/default)
|
1.0
|
Red Canvas on Google Chrome - One of the canvas elements has a red background on load when using the latest stable Google Chrome version (79.0.3945.88). You can view the issue from the online demo (https://ej2.syncfusion.com/react/demos/#/material/document-editor/default)
|
process
|
red canvas on google chrome one of the canvas elements has a red background on load when using the latest stable google chrome version you can view the issue from the online demo
| 1
|
9,966
| 13,011,938,217
|
IssuesEvent
|
2020-07-25 02:12:02
|
firebase/firebase-cpp-sdk
|
https://api.github.com/repos/firebase/firebase-cpp-sdk
|
reopened
|
Switch to python3
|
needs-info type: process
|
https://github.com/firebase/firebase-cpp-sdk#prerequisites requires Python2. Python2 is deprecated and has run out of support. It's not available on my system anymore (debian bullseye). Please switch to python3 which also has abseil-py and protobuf available.
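A quick sanity check that a python3 interpreter with those dependencies is what the build will pick up; this assumes `pip install absl-py protobuf` has already been run:
```python
import sys

from absl import flags          # provided by absl-py
import google.protobuf          # provided by protobuf

assert sys.version_info.major == 3
print(sys.version)
print("absl flags module:", flags.__name__)
print("protobuf:", google.protobuf.__version__)
```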
|
1.0
|
Switch to python3 - https://github.com/firebase/firebase-cpp-sdk#prerequisites requires Python2. Python2 is deprecated and has run out of support. It's not available on my system anymore (debian bullseye). Please switch to python3 which also has abseil-py and protobuf available.
|
process
|
switch to requires is deprecated and has run out of support it s not available on my system anymore debian bullseye please switch to which also has abseil py and protobuf available
| 1
|
235,420
| 18,049,174,001
|
IssuesEvent
|
2021-09-19 12:43:58
|
girlscript/winter-of-contributing
|
https://api.github.com/repos/girlscript/winter-of-contributing
|
closed
|
Flutter: Introduction to Dart
|
documentation GWOC21 Flutter
|
A README file to understand dart programming language.
It should cover all the basics and some advanced topics of DART, and maybe sections can be divided into several contributors.
Those who want to contribute can comment the topics they would like to cover.
Topics to cover:
1. Dart Variables
2. Dart Data Types
3. Dart Functions
4. Dart Arrow Function
5. Dart Lists
6. Dart Conditionals
7. Dart classes and objects
8. Dart Class constructors
9. Any other topics that you may find useful
References:
https://github.com/Asabeneh/30-Days-Of-JavaScript/blob/master/04_Day_Conditionals/04_day_conditionals.md
(This is JS repository, ours should also look like this only)
Dart: https://dart.dev/
|
1.0
|
Flutter: Introduction to Dart - A README file to understand dart programming language.
It should cover all the basics and some advanced topics of DART, and maybe sections can be divided into several contributors.
Those who want to contribute can comment the topics they would like to cover.
Topics to cover:
1. Dart Variables
2. Dart Data Types
3. Dart Functions
4. Dart Arrow Function
5. Dart Lists
6. Dart Conditionals
7. Dart classes and objects
8. Dart Class constructors
9. Any other topics that you may find useful
References:
https://github.com/Asabeneh/30-Days-Of-JavaScript/blob/master/04_Day_Conditionals/04_day_conditionals.md
(This is JS repository, ours should also look like this only)
Dart: https://dart.dev/
|
non_process
|
flutter introduction to dart a readme file to understand dart programming language it should cover all the basics and some advanced topics of dart and maybe sections can be divided into several contributors those who want to contribute can comment the topics they would like to cover topics to cover dart variables dart data types dart functions dart arrow function dart lists dart conditionals dart classes and objects dart class constructors any other topics that you may find useful references this is js repository ours should also look like this only dart
| 0
|
313
| 2,753,822,668
|
IssuesEvent
|
2015-04-25 02:32:42
|
google/truth
|
https://api.github.com/repos/google/truth
|
closed
|
Lack of plugin versions causes build API mismatch
|
Type: Process
|
```
[WARNING]
[WARNING] Some problems were encountered while building the effective model for com.google.truth:truth:jar:1.0-SNAPSHOT
[WARNING] 'build.plugins.plugin.version' for org.apache.maven.plugins:maven-compiler-plugin is missing. @ com.google.truth:truth:[unknown-version], /Users/jw/dev/other/truth/core/pom.xml, line 72, column 15
[WARNING] 'build.plugins.plugin.version' for org.apache.maven.plugins:maven-surefire-plugin is missing. @ com.google.truth:truth:[unknown-version], /Users/jw/dev/other/truth/core/pom.xml, line 136, column 15
[WARNING] 'build.plugins.plugin.version' for org.apache.maven.plugins:maven-jar-plugin is missing. @ com.google.truth:truth:[unknown-version], /Users/jw/dev/other/truth/core/pom.xml, line 110, column 15
[WARNING] 'build.plugins.plugin.version' for org.apache.maven.plugins:maven-source-plugin is missing. @ com.google.truth:truth:[unknown-version], /Users/jw/dev/other/truth/core/pom.xml, line 95, column 15
[WARNING] 'build.plugins.plugin.version' for org.apache.maven.plugins:maven-javadoc-plugin is missing. @ com.google.truth:truth:[unknown-version], /Users/jw/dev/other/truth/core/pom.xml, line 82, column 15
[WARNING]
[WARNING] Some problems were encountered while building the effective model for com.google.truth:truth-parent:pom:1.0-SNAPSHOT
[WARNING] 'build.plugins.plugin.version' for org.apache.maven.plugins:maven-jar-plugin is missing. @ line 39, column 15
[WARNING] 'build.plugins.plugin.version' for org.apache.maven.plugins:maven-source-plugin is missing. @ line 61, column 15
[WARNING] 'build.plugins.plugin.version' for org.apache.maven.plugins:maven-javadoc-plugin is missing. @ line 50, column 15
[WARNING]
[WARNING] It is highly recommended to fix these problems because they threaten the stability of your build.
[WARNING]
[WARNING] For this reason, future Maven versions might no longer support building such malformed projects.
[WARNING]
```
```
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-source-plugin:2.4:jar (default-cli) on project truth: Execution default-cli of goal org.apache.maven.plugins:maven-source-plugin:2.4:jar failed: An API incompatibility was encountered while executing org.apache.maven.plugins:maven-source-plugin:2.4:jar: java.lang.NoSuchMethodError: org.codehaus.plexus.components.io.attributes.Java7Reflector.isAtLeastJava7()Z
[ERROR] -----------------------------------------------------
[ERROR] realm = plugin>org.apache.maven.plugins:maven-source-plugin:2.4
[ERROR] strategy = org.codehaus.plexus.classworlds.strategy.SelfFirstStrategy
[ERROR] urls[0] = file:/Users/jw/.m2/repository/org/apache/maven/plugins/maven-source-plugin/2.4/maven-source-plugin-2.4.jar
[ERROR] urls[1] = file:/Users/jw/.m2/repository/org/codehaus/plexus/plexus-io/2.0.9/plexus-io-2.0.9.jar
[ERROR] urls[2] = file:/Users/jw/.m2/repository/backport-util-concurrent/backport-util-concurrent/3.1/backport-util-concurrent-3.1.jar
[ERROR] urls[3] = file:/Users/jw/.m2/repository/org/codehaus/plexus/plexus-interpolation/1.11/plexus-interpolation-1.11.jar
[ERROR] urls[4] = file:/Users/jw/.m2/repository/junit/junit/3.8.1/junit-3.8.1.jar
[ERROR] urls[5] = file:/Users/jw/.m2/repository/org/apache/maven/maven-archiver/2.5/maven-archiver-2.5.jar
[ERROR] urls[6] = file:/Users/jw/.m2/repository/org/apache/maven/reporting/maven-reporting-api/2.0.6/maven-reporting-api-2.0.6.jar
[ERROR] urls[7] = file:/Users/jw/.m2/repository/org/apache/maven/doxia/doxia-sink-api/1.0-alpha-7/doxia-sink-api-1.0-alpha-7.jar
[ERROR] urls[8] = file:/Users/jw/.m2/repository/commons-cli/commons-cli/1.0/commons-cli-1.0.jar
[ERROR] urls[9] = file:/Users/jw/.m2/repository/org/codehaus/plexus/plexus-interactivity-api/1.0-alpha-4/plexus-interactivity-api-1.0-alpha-4.jar
[ERROR] urls[10] = file:/Users/jw/.m2/repository/org/codehaus/plexus/plexus-archiver/2.6.3/plexus-archiver-2.6.3.jar
[ERROR] urls[11] = file:/Users/jw/.m2/repository/org/apache/commons/commons-compress/1.8.1/commons-compress-1.8.1.jar
[ERROR] urls[12] = file:/Users/jw/.m2/repository/org/codehaus/plexus/plexus-utils/3.0.18/plexus-utils-3.0.18.jar
[ERROR] Number of foreign imports: 1
[ERROR] import: Entry[import from realm ClassRealm[maven.api, parent: null]]
[ERROR]
[ERROR] -----------------------------------------------------
```
```
$ mvn -version
Apache Maven 3.2.3 (33f8c3e1027c3ddde99d3cdebad2656a31e8fdf4; 2014-08-11T13:58:10-07:00)
Maven home: /usr/local/Cellar/maven/3.2.3/libexec
Java version: 1.8.0_25, vendor: Oracle Corporation
Java home: /Library/Java/JavaVirtualMachines/jdk1.8.0_25.jdk/Contents/Home/jre
Default locale: en_US, platform encoding: UTF-8
OS name: "mac os x", version: "10.9.4", arch: "x86_64", family: "mac"
```
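The warnings go away once every plugin gets an explicit version. A minimal sketch of pinning them in the parent POM's `pluginManagement`; the version numbers below are illustrative, not recommendations:
```xml
<build>
  <pluginManagement>
    <plugins>
      <plugin>
        <groupId>org.apache.maven.plugins</groupId>
        <artifactId>maven-source-plugin</artifactId>
        <version>2.4</version>
      </plugin>
      <!-- same for maven-compiler-plugin, maven-surefire-plugin,
           maven-jar-plugin and maven-javadoc-plugin -->
    </plugins>
  </pluginManagement>
</build>
```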
|
1.0
|
Lack of plugin versions causes build API mismatch - ```
[WARNING]
[WARNING] Some problems were encountered while building the effective model for com.google.truth:truth:jar:1.0-SNAPSHOT
[WARNING] 'build.plugins.plugin.version' for org.apache.maven.plugins:maven-compiler-plugin is missing. @ com.google.truth:truth:[unknown-version], /Users/jw/dev/other/truth/core/pom.xml, line 72, column 15
[WARNING] 'build.plugins.plugin.version' for org.apache.maven.plugins:maven-surefire-plugin is missing. @ com.google.truth:truth:[unknown-version], /Users/jw/dev/other/truth/core/pom.xml, line 136, column 15
[WARNING] 'build.plugins.plugin.version' for org.apache.maven.plugins:maven-jar-plugin is missing. @ com.google.truth:truth:[unknown-version], /Users/jw/dev/other/truth/core/pom.xml, line 110, column 15
[WARNING] 'build.plugins.plugin.version' for org.apache.maven.plugins:maven-source-plugin is missing. @ com.google.truth:truth:[unknown-version], /Users/jw/dev/other/truth/core/pom.xml, line 95, column 15
[WARNING] 'build.plugins.plugin.version' for org.apache.maven.plugins:maven-javadoc-plugin is missing. @ com.google.truth:truth:[unknown-version], /Users/jw/dev/other/truth/core/pom.xml, line 82, column 15
[WARNING]
[WARNING] Some problems were encountered while building the effective model for com.google.truth:truth-parent:pom:1.0-SNAPSHOT
[WARNING] 'build.plugins.plugin.version' for org.apache.maven.plugins:maven-jar-plugin is missing. @ line 39, column 15
[WARNING] 'build.plugins.plugin.version' for org.apache.maven.plugins:maven-source-plugin is missing. @ line 61, column 15
[WARNING] 'build.plugins.plugin.version' for org.apache.maven.plugins:maven-javadoc-plugin is missing. @ line 50, column 15
[WARNING]
[WARNING] It is highly recommended to fix these problems because they threaten the stability of your build.
[WARNING]
[WARNING] For this reason, future Maven versions might no longer support building such malformed projects.
[WARNING]
```
```
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-source-plugin:2.4:jar (default-cli) on project truth: Execution default-cli of goal org.apache.maven.plugins:maven-source-plugin:2.4:jar failed: An API incompatibility was encountered while executing org.apache.maven.plugins:maven-source-plugin:2.4:jar: java.lang.NoSuchMethodError: org.codehaus.plexus.components.io.attributes.Java7Reflector.isAtLeastJava7()Z
[ERROR] -----------------------------------------------------
[ERROR] realm = plugin>org.apache.maven.plugins:maven-source-plugin:2.4
[ERROR] strategy = org.codehaus.plexus.classworlds.strategy.SelfFirstStrategy
[ERROR] urls[0] = file:/Users/jw/.m2/repository/org/apache/maven/plugins/maven-source-plugin/2.4/maven-source-plugin-2.4.jar
[ERROR] urls[1] = file:/Users/jw/.m2/repository/org/codehaus/plexus/plexus-io/2.0.9/plexus-io-2.0.9.jar
[ERROR] urls[2] = file:/Users/jw/.m2/repository/backport-util-concurrent/backport-util-concurrent/3.1/backport-util-concurrent-3.1.jar
[ERROR] urls[3] = file:/Users/jw/.m2/repository/org/codehaus/plexus/plexus-interpolation/1.11/plexus-interpolation-1.11.jar
[ERROR] urls[4] = file:/Users/jw/.m2/repository/junit/junit/3.8.1/junit-3.8.1.jar
[ERROR] urls[5] = file:/Users/jw/.m2/repository/org/apache/maven/maven-archiver/2.5/maven-archiver-2.5.jar
[ERROR] urls[6] = file:/Users/jw/.m2/repository/org/apache/maven/reporting/maven-reporting-api/2.0.6/maven-reporting-api-2.0.6.jar
[ERROR] urls[7] = file:/Users/jw/.m2/repository/org/apache/maven/doxia/doxia-sink-api/1.0-alpha-7/doxia-sink-api-1.0-alpha-7.jar
[ERROR] urls[8] = file:/Users/jw/.m2/repository/commons-cli/commons-cli/1.0/commons-cli-1.0.jar
[ERROR] urls[9] = file:/Users/jw/.m2/repository/org/codehaus/plexus/plexus-interactivity-api/1.0-alpha-4/plexus-interactivity-api-1.0-alpha-4.jar
[ERROR] urls[10] = file:/Users/jw/.m2/repository/org/codehaus/plexus/plexus-archiver/2.6.3/plexus-archiver-2.6.3.jar
[ERROR] urls[11] = file:/Users/jw/.m2/repository/org/apache/commons/commons-compress/1.8.1/commons-compress-1.8.1.jar
[ERROR] urls[12] = file:/Users/jw/.m2/repository/org/codehaus/plexus/plexus-utils/3.0.18/plexus-utils-3.0.18.jar
[ERROR] Number of foreign imports: 1
[ERROR] import: Entry[import from realm ClassRealm[maven.api, parent: null]]
[ERROR]
[ERROR] -----------------------------------------------------
```
```
$ mvn -version
Apache Maven 3.2.3 (33f8c3e1027c3ddde99d3cdebad2656a31e8fdf4; 2014-08-11T13:58:10-07:00)
Maven home: /usr/local/Cellar/maven/3.2.3/libexec
Java version: 1.8.0_25, vendor: Oracle Corporation
Java home: /Library/Java/JavaVirtualMachines/jdk1.8.0_25.jdk/Contents/Home/jre
Default locale: en_US, platform encoding: UTF-8
OS name: "mac os x", version: "10.9.4", arch: "x86_64", family: "mac"
```
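The failure above comes from the unpinned maven-source-plugin 2.4 hitting what looks like an API mismatch between the plexus-archiver and plexus-io versions it pulls in; pinning explicit plugin versions (e.g. in `<pluginManagement>`) is the usual remedy, exactly as the warnings suggest. As a quick audit aid, here is a minimal Python sketch that lists plugins declared without a `<version>`; the `core/pom.xml` path is an illustrative assumption:
```python
# Sketch: list Maven plugins declared without an explicit <version>,
# which is what the "build.plugins.plugin.version ... is missing"
# warnings above complain about. The pom.xml path is hypothetical.
import xml.etree.ElementTree as ET

NS = {"m": "http://maven.apache.org/POM/4.0.0"}

def plugins_missing_version(pom_path):
    root = ET.parse(pom_path).getroot()
    missing = []
    for plugin in root.findall(".//m:build/m:plugins/m:plugin", NS):
        artifact = plugin.findtext("m:artifactId", default="?", namespaces=NS)
        if plugin.find("m:version", NS) is None:
            missing.append(artifact)
    return missing

if __name__ == "__main__":
    for name in plugins_missing_version("core/pom.xml"):
        print(f"plugin without pinned version: {name}")
```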
|
process
|
lack of plugin versions causes build api mismatch some problems were encountered while building the effective model for com google truth truth jar snapshot build plugins plugin version for org apache maven plugins maven compiler plugin is missing com google truth truth users jw dev other truth core pom xml line column build plugins plugin version for org apache maven plugins maven surefire plugin is missing com google truth truth users jw dev other truth core pom xml line column build plugins plugin version for org apache maven plugins maven jar plugin is missing com google truth truth users jw dev other truth core pom xml line column build plugins plugin version for org apache maven plugins maven source plugin is missing com google truth truth users jw dev other truth core pom xml line column build plugins plugin version for org apache maven plugins maven javadoc plugin is missing com google truth truth users jw dev other truth core pom xml line column some problems were encountered while building the effective model for com google truth truth parent pom snapshot build plugins plugin version for org apache maven plugins maven jar plugin is missing line column build plugins plugin version for org apache maven plugins maven source plugin is missing line column build plugins plugin version for org apache maven plugins maven javadoc plugin is missing line column it is highly recommended to fix these problems because they threaten the stability of your build for this reason future maven versions might no longer support building such malformed projects failed to execute goal org apache maven plugins maven source plugin jar default cli on project truth execution default cli of goal org apache maven plugins maven source plugin jar failed an api incompatibility was encountered while executing org apache maven plugins maven source plugin jar java lang nosuchmethoderror org codehaus plexus components io attributes z realm plugin org apache maven plugins maven source plugin strategy org codehaus plexus classworlds strategy selffirststrategy urls file users jw repository org apache maven plugins maven source plugin maven source plugin jar urls file users jw repository org codehaus plexus plexus io plexus io jar urls file users jw repository backport util concurrent backport util concurrent backport util concurrent jar urls file users jw repository org codehaus plexus plexus interpolation plexus interpolation jar urls file users jw repository junit junit junit jar urls file users jw repository org apache maven maven archiver maven archiver jar urls file users jw repository org apache maven reporting maven reporting api maven reporting api jar urls file users jw repository org apache maven doxia doxia sink api alpha doxia sink api alpha jar urls file users jw repository commons cli commons cli commons cli jar urls file users jw repository org codehaus plexus plexus interactivity api alpha plexus interactivity api alpha jar urls file users jw repository org codehaus plexus plexus archiver plexus archiver jar urls file users jw repository org apache commons commons compress commons compress jar urls file users jw repository org codehaus plexus plexus utils plexus utils jar number of foreign imports import entry mvn version apache maven maven home usr local cellar maven libexec java version vendor oracle corporation java home library java javavirtualmachines jdk contents home jre default locale en us platform encoding utf os name mac os x version arch family mac
| 1
|
760,994
| 26,662,836,776
|
IssuesEvent
|
2023-01-25 23:00:55
|
bridgetownrb/bridgetown
|
https://api.github.com/repos/bridgetownrb/bridgetown
|
closed
|
Missing name gsubs when creating a new plugin with bridgetown-1.2.0.beta4
|
bug high priority
|
```
% bridgetown plugins new bridgetown-credentials
run git clone -b v1.2-initializer https://github.com/bridgetownrb/bridgetown-sample-plugin bridgetown-credentials from "."
Cloning into 'bridgetown-credentials'...
remote: Enumerating objects: 223, done.
remote: Counting objects: 100% (223/223), done.
remote: Compressing objects: 100% (165/165), done.
remote: Total 223 (delta 89), reused 143 (delta 35), pack-reused 0
Receiving objects: 100% (223/223), 840.49 KiB | 8.94 MiB/s, done.
Resolving deltas: 100% (89/89), done.
run rm -rf .git from "./bridgetown-credentials"
run git init from "./bridgetown-credentials"
Initialized empty Git repository in /Users/svoop/Development/ofm/web/bridgetown-credentials/.git/
gsub bridgetown-credentials/bridgetown-credentials.gemspec
gsub bridgetown-credentials/bridgetown-credentials.gemspec
gsub bridgetown-credentials/bridgetown-credentials.gemspec
gsub bridgetown-credentials/bridgetown-credentials.gemspec
gsub bridgetown-credentials/package.json
gsub bridgetown-credentials/package.json
gsub bridgetown-credentials/lib/bridgetown-credentials.rb
Exception raised: Errno::ENOENT
No such file or directory @ rb_sysopen - /Users/svoop/Development/ofm/web/bridgetown-credentials/lib/bridgetown-credentials.rb
1: /Users/svoop/.gem/ruby/3.1.0/gems/thor-1.2.1/lib/thor/actions/file_manipulation.rb:274:in `binread'
2: /Users/svoop/.gem/ruby/3.1.0/gems/thor-1.2.1/lib/thor/actions/file_manipulation.rb:274:in `gsub_file'
3: /Users/svoop/.gem/ruby/3.1.0/gems/bridgetown-core-1.2.0.beta4/lib/bridgetown-core/commands/plugins.rb:202:in `block in new'
4: /Users/svoop/.gem/ruby/3.1.0/gems/thor-1.2.1/lib/thor/actions.rb:190:in `block in inside'
5: /Users/svoop/.rubies/ruby-3.1.3/lib/ruby/3.1.0/fileutils.rb:139:in `chdir'
Backtrace: Use the --trace option for complete information.
```
**Bridgetown Version**:
1.2.0.beta4
**Computing environment (please complete the following information):**
- OS: macOS 13.1 ARM
- Ruby Version: 3.1.3
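The traceback shows a substitution applied to `lib/bridgetown-credentials.rb` before any file with that name exists, i.e. a rename/gsub step appears to be missing from the generator. A defensive variant of such an in-place substitution checks the path first and fails with a clear message; this is a generic Python sketch of the pattern, not the actual fix in bridgetown-core:
```python
# Generic sketch of a guarded in-place substitution: fail loudly (and
# early) when the target file does not exist, instead of crashing deep
# inside the templating step. Names here are illustrative.
from pathlib import Path

def gsub_file(path: str, old: str, new: str) -> None:
    p = Path(path)
    if not p.is_file():
        raise FileNotFoundError(f"cannot gsub missing file: {p}")
    p.write_text(p.read_text().replace(old, new))

# usage (hypothetical): gsub_file("lib/bridgetown-credentials.rb",
#                                 "SamplePlugin", "Credentials")
```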
|
1.0
|
Missing name gsubs when creating a new plugin with bridgetown-1.2.0.beta4 - ```
% bridgetown plugins new bridgetown-credentials
run git clone -b v1.2-initializer https://github.com/bridgetownrb/bridgetown-sample-plugin bridgetown-credentials from "."
Cloning into 'bridgetown-credentials'...
remote: Enumerating objects: 223, done.
remote: Counting objects: 100% (223/223), done.
remote: Compressing objects: 100% (165/165), done.
remote: Total 223 (delta 89), reused 143 (delta 35), pack-reused 0
Receiving objects: 100% (223/223), 840.49 KiB | 8.94 MiB/s, done.
Resolving deltas: 100% (89/89), done.
run rm -rf .git from "./bridgetown-credentials"
run git init from "./bridgetown-credentials"
Initialized empty Git repository in /Users/svoop/Development/ofm/web/bridgetown-credentials/.git/
gsub bridgetown-credentials/bridgetown-credentials.gemspec
gsub bridgetown-credentials/bridgetown-credentials.gemspec
gsub bridgetown-credentials/bridgetown-credentials.gemspec
gsub bridgetown-credentials/bridgetown-credentials.gemspec
gsub bridgetown-credentials/package.json
gsub bridgetown-credentials/package.json
gsub bridgetown-credentials/lib/bridgetown-credentials.rb
Exception raised: Errno::ENOENT
No such file or directory @ rb_sysopen - /Users/svoop/Development/ofm/web/bridgetown-credentials/lib/bridgetown-credentials.rb
1: /Users/svoop/.gem/ruby/3.1.0/gems/thor-1.2.1/lib/thor/actions/file_manipulation.rb:274:in `binread'
2: /Users/svoop/.gem/ruby/3.1.0/gems/thor-1.2.1/lib/thor/actions/file_manipulation.rb:274:in `gsub_file'
3: /Users/svoop/.gem/ruby/3.1.0/gems/bridgetown-core-1.2.0.beta4/lib/bridgetown-core/commands/plugins.rb:202:in `block in new'
4: /Users/svoop/.gem/ruby/3.1.0/gems/thor-1.2.1/lib/thor/actions.rb:190:in `block in inside'
5: /Users/svoop/.rubies/ruby-3.1.3/lib/ruby/3.1.0/fileutils.rb:139:in `chdir'
Backtrace: Use the --trace option for complete information.
```
**Bridgetown Version**:
1.2.0.beta4
**Computing environment (please complete the following information):**
- OS: macOS 13.1 ARM
- Ruby Version: 3.1.3
|
non_process
|
missing name gsubs when creating a new plugin with bridgetown bridgetown plugins new bridgetown credentials run git clone b initializer bridgetown credentials from cloning into bridgetown credentials remote enumerating objects done remote counting objects done remote compressing objects done remote total delta reused delta pack reused receiving objects kib mib s done resolving deltas done run rm rf git from bridgetown credentials run git init from bridgetown credentials initialized empty git repository in users svoop development ofm web bridgetown credentials git gsub bridgetown credentials bridgetown credentials gemspec gsub bridgetown credentials bridgetown credentials gemspec gsub bridgetown credentials bridgetown credentials gemspec gsub bridgetown credentials bridgetown credentials gemspec gsub bridgetown credentials package json gsub bridgetown credentials package json gsub bridgetown credentials lib bridgetown credentials rb exception raised errno enoent no such file or directory rb sysopen users svoop development ofm web bridgetown credentials lib bridgetown credentials rb users svoop gem ruby gems thor lib thor actions file manipulation rb in binread users svoop gem ruby gems thor lib thor actions file manipulation rb in gsub file users svoop gem ruby gems bridgetown core lib bridgetown core commands plugins rb in block in new users svoop gem ruby gems thor lib thor actions rb in block in inside users svoop rubies ruby lib ruby fileutils rb in chdir backtrace use the trace option for complete information bridgetown version computing environment please complete the following information os macos arm ruby version
| 0
|
1,544
| 4,153,935,502
|
IssuesEvent
|
2016-06-16 09:38:51
|
openvstorage/framework
|
https://api.github.com/repos/openvstorage/framework
|
closed
|
use the standard ovs log format
|
priority_critical process_wontfix type_bug
|
Not all the logs have the standard ovs format. [log format](https://openvstorage.gitbooks.io/framework/content/docs/log.html)
ovs-workers.log
```
2016-06-09 16:52:08 13200 +0200 - cmp02 - 25562/139660310509376 - celery/celery.worker.job - 159 - INFO - Task ovs.storagerouter.ping[d7c6fd40-4d52-4f92-b9d1-7556417d3146] succeeded in 0.119225164875s: None
[2016-06-09 16:52:13,739: INFO/Worker-1] Starting new HTTPS connection (1): 10.100.199.3
[2016-06-09 16:52:17,323: INFO/Worker-1] Starting new HTTPS connection (1): 10.100.199.3
[2016-06-09 16:52:17,383: INFO/Worker-1] Starting new HTTPS connection (1): 10.100.199.3
[2016-06-09 16:52:18,436: INFO/Worker-1] Starting new HTTPS connection (1): 10.100.199.3
libust[8212/8212]: Warning: HOME environment variable not set. Disabling LTTng-UST per-user tracing. (in setup_local_apps() at lttng-ust-comm.c:305)
```
ovs-webapp-api.log
```
2016-06-13 13:22:29 [26764] [INFO] Booting worker with pid: 26764
2016-06-13 13:22:29 [26774] [INFO] Booting worker with pid: 26774
2016-06-13 13:22:29 [26777] [INFO] Booting worker with pid: 26777
2016-06-13 13:22:29 [26779] [INFO] Booting worker with pid: 26779
2016-06-13 13:48:15 [26761] [INFO] Worker exiting (pid: 26761)
2016-06-13 13:48:15 [26764] [INFO] Worker exiting (pid: 26764)
2016-06-13 13:48:15 [26774] [INFO] Worker exiting (pid: 26774)
2016-06-13 13:48:15 [26779] [INFO] Worker exiting (pid: 26779)
2016-06-13 13:48:15 [26777] [INFO] Worker exiting (pid: 26777)
2016-06-13 13:48:15 [26731] [INFO] Handling signal: term
2016-06-13 13:48:15 [26731] [INFO] Shutting down: Master
2016-06-13 13:48:19 [3320] [INFO] Starting gunicorn 17.5
2016-06-13 13:48:19 [3320] [INFO] Listening at: http://127.0.0.1:8002 (3320)
2016-06-13 13:48:19 [3320] [INFO] Using worker: gevent
2016-06-13 13:48:19 [3345] [INFO] Booting worker with pid: 3345
2016-06-13 13:48:19 [3346] [INFO] Booting worker with pid: 3346
2016-06-13 13:48:19 [3355] [INFO] Booting worker with pid: 3355
2016-06-13 13:48:20 [3356] [INFO] Booting worker with pid: 3356
2016-06-13 13:48:20 [3360] [INFO] Booting worker with pid: 3360
2016-06-13 14:35:37 [3345] [INFO] Worker exiting (pid: 3345)
2016-06-13 14:35:37 [3346] [INFO] Worker exiting (pid: 3346)
2016-06-13 14:35:37 [3355] [INFO] Worker exiting (pid: 3355)
2016-06-13 14:35:37 [3356] [INFO] Worker exiting (pid: 3356)
2016-06-13 14:35:37 [3320] [INFO] Handling signal: term
2016-06-13 14:35:37 [3360] [INFO] Worker exiting (pid: 3360)
2016-06-13 14:35:37 [3320] [INFO] Shutting down: Master
2016-06-13 14:35:42 [22761] [INFO] Starting gunicorn 17.5
2016-06-13 14:35:42 [22761] [INFO] Listening at: http://127.0.0.1:8002 (22761)
2016-06-13 14:35:42 [22761] [INFO] Using worker: gevent
2016-06-13 14:35:42 [22792] [INFO] Booting worker with pid: 22792
2016-06-13 14:35:42 [22793] [INFO] Booting worker with pid: 22793
2016-06-13 14:35:42 [22794] [INFO] Booting worker with pid: 22794
2016-06-13 14:35:42 [22795] [INFO] Booting worker with pid: 22795
2016-06-13 14:35:42 [22796] [INFO] Booting worker with pid: 22796
```
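For services the team controls, the gunicorn and celery handlers could be pointed at a formatter that emits the standard layout instead. The sketch below approximates the format of the conforming sample line above; the numeric token after the timestamp and the timezone handling are assumptions, since their exact meaning is not documented here:
```python
# Sketch: a logging.Formatter that approximates the standard OVS log
# layout shown in the first sample line. The numeric token after the
# timestamp is assumed to be milliseconds, and the tz offset is fixed;
# treat the field order as inferred from the sample.
import logging
import os
import socket
import threading

class OvsLikeFormatter(logging.Formatter):
    def format(self, record):
        ts = self.formatTime(record, "%Y-%m-%d %H:%M:%S")
        return " - ".join([
            f"{ts} {int(record.msecs):03d} +0000",     # timestamp, ms, tz (assumed)
            socket.gethostname(),                      # host, e.g. cmp02
            f"{os.getpid()}/{threading.get_ident()}",  # pid/thread id
            record.name,                               # logger name
            str(record.lineno),                        # source line
            record.levelname,                          # INFO, ...
            record.getMessage(),                       # the message itself
        ])

handler = logging.StreamHandler()
handler.setFormatter(OvsLikeFormatter())
logging.basicConfig(level=logging.INFO, handlers=[handler])
logging.getLogger("celery.worker.job").info("worker ready")
```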
|
1.0
|
use the standard ovs log format - Not all the logs have the standard ovs format. [log format](https://openvstorage.gitbooks.io/framework/content/docs/log.html)
ovs-workers.log
```
2016-06-09 16:52:08 13200 +0200 - cmp02 - 25562/139660310509376 - celery/celery.worker.job - 159 - INFO - Task ovs.storagerouter.ping[d7c6fd40-4d52-4f92-b9d1-7556417d3146] succeeded in 0.119225164875s: None
[2016-06-09 16:52:13,739: INFO/Worker-1] Starting new HTTPS connection (1): 10.100.199.3
[2016-06-09 16:52:17,323: INFO/Worker-1] Starting new HTTPS connection (1): 10.100.199.3
[2016-06-09 16:52:17,383: INFO/Worker-1] Starting new HTTPS connection (1): 10.100.199.3
[2016-06-09 16:52:18,436: INFO/Worker-1] Starting new HTTPS connection (1): 10.100.199.3
libust[8212/8212]: Warning: HOME environment variable not set. Disabling LTTng-UST per-user tracing. (in setup_local_apps() at lttng-ust-comm.c:305)
```
ovs-webapp-api.log
```
2016-06-13 13:22:29 [26764] [INFO] Booting worker with pid: 26764
2016-06-13 13:22:29 [26774] [INFO] Booting worker with pid: 26774
2016-06-13 13:22:29 [26777] [INFO] Booting worker with pid: 26777
2016-06-13 13:22:29 [26779] [INFO] Booting worker with pid: 26779
2016-06-13 13:48:15 [26761] [INFO] Worker exiting (pid: 26761)
2016-06-13 13:48:15 [26764] [INFO] Worker exiting (pid: 26764)
2016-06-13 13:48:15 [26774] [INFO] Worker exiting (pid: 26774)
2016-06-13 13:48:15 [26779] [INFO] Worker exiting (pid: 26779)
2016-06-13 13:48:15 [26777] [INFO] Worker exiting (pid: 26777)
2016-06-13 13:48:15 [26731] [INFO] Handling signal: term
2016-06-13 13:48:15 [26731] [INFO] Shutting down: Master
2016-06-13 13:48:19 [3320] [INFO] Starting gunicorn 17.5
2016-06-13 13:48:19 [3320] [INFO] Listening at: http://127.0.0.1:8002 (3320)
2016-06-13 13:48:19 [3320] [INFO] Using worker: gevent
2016-06-13 13:48:19 [3345] [INFO] Booting worker with pid: 3345
2016-06-13 13:48:19 [3346] [INFO] Booting worker with pid: 3346
2016-06-13 13:48:19 [3355] [INFO] Booting worker with pid: 3355
2016-06-13 13:48:20 [3356] [INFO] Booting worker with pid: 3356
2016-06-13 13:48:20 [3360] [INFO] Booting worker with pid: 3360
2016-06-13 14:35:37 [3345] [INFO] Worker exiting (pid: 3345)
2016-06-13 14:35:37 [3346] [INFO] Worker exiting (pid: 3346)
2016-06-13 14:35:37 [3355] [INFO] Worker exiting (pid: 3355)
2016-06-13 14:35:37 [3356] [INFO] Worker exiting (pid: 3356)
2016-06-13 14:35:37 [3320] [INFO] Handling signal: term
2016-06-13 14:35:37 [3360] [INFO] Worker exiting (pid: 3360)
2016-06-13 14:35:37 [3320] [INFO] Shutting down: Master
2016-06-13 14:35:42 [22761] [INFO] Starting gunicorn 17.5
2016-06-13 14:35:42 [22761] [INFO] Listening at: http://127.0.0.1:8002 (22761)
2016-06-13 14:35:42 [22761] [INFO] Using worker: gevent
2016-06-13 14:35:42 [22792] [INFO] Booting worker with pid: 22792
2016-06-13 14:35:42 [22793] [INFO] Booting worker with pid: 22793
2016-06-13 14:35:42 [22794] [INFO] Booting worker with pid: 22794
2016-06-13 14:35:42 [22795] [INFO] Booting worker with pid: 22795
2016-06-13 14:35:42 [22796] [INFO] Booting worker with pid: 22796
```
|
process
|
use the standard ovs log format not all the logs have the standard ovs format ovs workers log celery celery worker job info task ovs storagerouter ping succeeded in none starting new https connection starting new https connection starting new https connection starting new https connection libust warning home environment variable not set disabling lttng ust per user tracing in setup local apps at lttng ust comm c ovs webapp api log booting worker with pid booting worker with pid booting worker with pid booting worker with pid worker exiting pid worker exiting pid worker exiting pid worker exiting pid worker exiting pid handling signal term shutting down master starting gunicorn listening at using worker gevent booting worker with pid booting worker with pid booting worker with pid booting worker with pid booting worker with pid worker exiting pid worker exiting pid worker exiting pid worker exiting pid handling signal term worker exiting pid shutting down master starting gunicorn listening at using worker gevent booting worker with pid booting worker with pid booting worker with pid booting worker with pid booting worker with pid
| 1
|
265,700
| 28,298,045,980
|
IssuesEvent
|
2023-04-10 01:28:30
|
nk7598/linux-4.19.72
|
https://api.github.com/repos/nk7598/linux-4.19.72
|
closed
|
WS-2021-0529 (Medium) detected in linuxlinux-4.19.269 - autoclosed
|
Mend: dependency security vulnerability
|
## WS-2021-0529 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>linuxlinux-4.19.269</b></p></summary>
<p>
<p>The Linux Kernel</p>
<p>Library home page: <a href=https://mirrors.edge.kernel.org/pub/linux/kernel/v4.x/?wsslib=linux>https://mirrors.edge.kernel.org/pub/linux/kernel/v4.x/?wsslib=linux</a></p>
<p>Found in base branch: <b>master</b></p></p>
</details>
</p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Source Files (3)</summary>
<p></p>
<p>
<img src='https://s3.amazonaws.com/wss-public/bitbucketImages/xRedImage.png' width=19 height=20> <b>/drivers/md/persistent-data/dm-btree-remove.c</b>
<img src='https://s3.amazonaws.com/wss-public/bitbucketImages/xRedImage.png' width=19 height=20> <b>/drivers/md/persistent-data/dm-btree-remove.c</b>
<img src='https://s3.amazonaws.com/wss-public/bitbucketImages/xRedImage.png' width=19 height=20> <b>/drivers/md/persistent-data/dm-btree-remove.c</b>
</p>
</details>
<p></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
Linux/Kernel is vulnerable to use after free in rebalance_children() in ems_pcmcia_add_card() in drivers/md/persistent-data/dm-btree-remove.c
<p>Publish Date: 2021-12-01
<p>URL: <a href=https://github.com/gregkh/linux/commit/607beb420b3fe23b948a9bf447d993521a02fbbb>WS-2021-0529</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>6.2</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Local
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
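For reference, the 6.2 above is reproducible from the listed metrics. A minimal Python sketch using the CVSS v3 spec constants for these metric values (the round-up is simplified to a plain one-decimal ceiling):
```python
# Sketch: reproduce the 6.2 base score from the listed CVSS v3 metrics
# (scope unchanged), using the spec's constants for each metric value.
import math

roundup = lambda x: math.ceil(x * 10) / 10   # simplified CVSS Roundup

av, ac, pr, ui = 0.55, 0.77, 0.85, 0.85      # Local / Low / None / None
c, i, a = 0.0, 0.0, 0.56                     # None / None / High

iss = 1 - (1 - c) * (1 - i) * (1 - a)
impact = 6.42 * iss                          # scope unchanged
exploitability = 8.22 * av * ac * pr * ui

base = roundup(min(impact + exploitability, 10)) if impact > 0 else 0.0
print(base)  # -> 6.2
```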
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://osv.dev/vulnerability/GSD-2021-1002737">https://osv.dev/vulnerability/GSD-2021-1002737</a></p>
<p>Release Date: 2021-12-01</p>
<p>Fix Resolution: Linux/Kernel - v5.10.88, v5.15.11, v5.15.11</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with Mend [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
|
True
|
WS-2021-0529 (Medium) detected in linuxlinux-4.19.269 - autoclosed - ## WS-2021-0529 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>linuxlinux-4.19.269</b></p></summary>
<p>
<p>The Linux Kernel</p>
<p>Library home page: <a href=https://mirrors.edge.kernel.org/pub/linux/kernel/v4.x/?wsslib=linux>https://mirrors.edge.kernel.org/pub/linux/kernel/v4.x/?wsslib=linux</a></p>
<p>Found in base branch: <b>master</b></p></p>
</details>
</p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Source Files (3)</summary>
<p></p>
<p>
<img src='https://s3.amazonaws.com/wss-public/bitbucketImages/xRedImage.png' width=19 height=20> <b>/drivers/md/persistent-data/dm-btree-remove.c</b>
<img src='https://s3.amazonaws.com/wss-public/bitbucketImages/xRedImage.png' width=19 height=20> <b>/drivers/md/persistent-data/dm-btree-remove.c</b>
<img src='https://s3.amazonaws.com/wss-public/bitbucketImages/xRedImage.png' width=19 height=20> <b>/drivers/md/persistent-data/dm-btree-remove.c</b>
</p>
</details>
<p></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
Linux/Kernel is vulnerable to use after free in rebalance_children() in ems_pcmcia_add_card() in drivers/md/persistent-data/dm-btree-remove.c
<p>Publish Date: 2021-12-01
<p>URL: <a href=https://github.com/gregkh/linux/commit/607beb420b3fe23b948a9bf447d993521a02fbbb>WS-2021-0529</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>6.2</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Local
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://osv.dev/vulnerability/GSD-2021-1002737">https://osv.dev/vulnerability/GSD-2021-1002737</a></p>
<p>Release Date: 2021-12-01</p>
<p>Fix Resolution: Linux/Kernel - v5.10.88, v5.15.11, v5.15.11</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with Mend [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
|
non_process
|
ws medium detected in linuxlinux autoclosed ws medium severity vulnerability vulnerable library linuxlinux the linux kernel library home page a href found in base branch master vulnerable source files drivers md persistent data dm btree remove c drivers md persistent data dm btree remove c drivers md persistent data dm btree remove c vulnerability details linux kernel is vulnerable to use after free in rebalance children in ems pcmcia add card in drivers md persistent data dm btree remove c publish date url a href cvss score details base score metrics exploitability metrics attack vector local attack complexity low privileges required none user interaction none scope unchanged impact metrics confidentiality impact none integrity impact none availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution linux kernel step up your open source security game with mend
| 0
|
14,700
| 17,873,252,317
|
IssuesEvent
|
2021-09-06 20:01:00
|
googleapis/google-cloud-dotnet
|
https://api.github.com/repos/googleapis/google-cloud-dotnet
|
opened
|
Warning: a recent release failed
|
type: process
|
The following release PRs may have failed:
* #7124
* #7126
* #7133
* #7125
* #7128
* #7123
* #7138
* #7127
* #7137
|
1.0
|
Warning: a recent release failed - The following release PRs may have failed:
* #7124
* #7126
* #7133
* #7125
* #7128
* #7123
* #7138
* #7127
* #7137
|
process
|
warning a recent release failed the following release prs may have failed
| 1
|
79,077
| 9,820,470,253
|
IssuesEvent
|
2019-06-14 02:46:14
|
cityofaustin/techstack
|
https://api.github.com/repos/cityofaustin/techstack
|
closed
|
Joplin - Implementation Matches Design - Assign pages to departments
|
Author Interface Team: Design + Research Team: Dev Wagtail
|
As a content editor, I need to be able to assign an info or service page to a department, so that when it is published it is associated on janis with the correct department.
Not quite sure how pages are currently assigned to a dept - some invisible mechanism on the backend?
|
1.0
|
Joplin - Implementation Matches Design - Assign pages to departments - As a content editor, I need to be able to assign an info or service page to a department, so that when it is published it is associated on janis with the correct department.
Not quite sure how pages are currently assigned to a dept - some invisible mechanism on the backend?
|
non_process
|
joplin implementation matches design assign pages to departments as a content editor i need to be able to assign a info or service page to a department so that when it is published it is associated on janis with the correct department not quite sure how pages are currently assigned to a dept some invisible mechanism on the backend
| 0
|
282,160
| 8,704,177,498
|
IssuesEvent
|
2018-12-05 18:39:58
|
lbryio/lbry
|
https://api.github.com/repos/lbryio/lbry
|
closed
|
tx_list error: Invalid claim update state, expected to find previous claim in input
|
area: wallet priority: blocker type: bug
|
<!--
Thanks for reporting an issue to LBRY and helping us improve!
To make it possible for us to help you, please fill out below information carefully.
Before reporting any issues, please make sure that you're using the latest version.
- App: https://github.com/lbryio/lbry-desktop/releases
- Daemon: https://github.com/lbryio/lbry/releases
We are also available on Discord at https://chat.lbry.io
-->
## The Issue
On rc7 after transaction_list changes. Re-sync does not help.
```
C:\Users\thoma\Desktop\latestdaemomn\030rc2>lbrynet transaction_list
{
"code": -32500,
"data": [
" File \"twisted\\internet\\defer.py\", line 654, in _runCallbacks",
" ",
" File \"lbrynet\\extras\\daemon\\auth\\server.py\", line 96, in trap",
" ",
" File \"twisted\\python\\failure.py\", line 439, in trap",
" ",
" File \"twisted\\python\\failure.py\", line 467, in raiseException",
" ",
" File \"twisted\\internet\\defer.py\", line 824, in adapt",
" ",
" File \"lbrynet\\extras\\daemon\\Daemon.py\", line 93, in maybe_paginate",
" ",
" File \"lbrynet\\extras\\wallet\\manager.py\", line 339, in get_history",
" ",
"builtins.AssertionError: Invalid claim update state, expected to find previous claim in input."
],
"message": "Invalid claim update state, expected to find previous claim in input."
}
```
## System Configuration
<!-- For the app, this info is in the About section at the bottom of the Help page.
You can include a screenshot instead of typing it out -->
<!-- For the daemon, run:
curl 'http://localhost:5279' --data '{"method":"version"}'
and include the full output -->
- LBRY Daemon version:
- LBRY App version:
- LBRY Installation ID:
- Operating system:
## Anything Else
<!-- Include anything else that does not fit into the above sections -->
## Screenshots
<!-- If a screenshot would help explain the bug, please include one or two here -->
## Internal Use
### Acceptance Criteria
1.
2.
3.
### Definition of Done
- [ ] Tested against acceptance criteria
- [ ] Tested against the assumptions of user story
- [ ] The project builds without errors
- [ ] Unit tests are written and passing
- [ ] Tests on devices/browsers listed in the issue have passed
- [ ] QA performed & issues resolved
- [ ] Refactoring completed
- [ ] Any configuration or build changes documented
- [ ] Documentation updated
- [ ] Peer Code Review performed
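To reproduce this from a script rather than the CLI, the same JSON-RPC endpoint documented in the template above can be called from Python. The helper name and the `params` field are illustrative; the endpoint and the `version`/`transaction_list` methods come from the report itself:
```python
# Sketch: issue the JSON-RPC calls from Python against the daemon's
# default localhost:5279 endpoint (mirrors the curl in the template).
import json
import urllib.request

def lbrynet_call(method, **params):
    body = {"method": method}
    if params:
        body["params"] = params  # params handling is an assumption
    req = urllib.request.Request(
        "http://localhost:5279",
        data=json.dumps(body).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

print(lbrynet_call("version"))
print(lbrynet_call("transaction_list"))  # reproduces the -32500 error above
```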
|
1.0
|
tx_list error: Invalid claim update state, expected to find previous claim in input - <!--
Thanks for reporting an issue to LBRY and helping us improve!
To make it possible for us to help you, please fill out below information carefully.
Before reporting any issues, please make sure that you're using the latest version.
- App: https://github.com/lbryio/lbry-desktop/releases
- Daemon: https://github.com/lbryio/lbry/releases
We are also available on Discord at https://chat.lbry.io
-->
## The Issue
On rc7 after transaction_list changes. Re-sync does not help.
```
C:\Users\thoma\Desktop\latestdaemomn\030rc2>lbrynet transaction_list
{
"code": -32500,
"data": [
" File \"twisted\\internet\\defer.py\", line 654, in _runCallbacks",
" ",
" File \"lbrynet\\extras\\daemon\\auth\\server.py\", line 96, in trap",
" ",
" File \"twisted\\python\\failure.py\", line 439, in trap",
" ",
" File \"twisted\\python\\failure.py\", line 467, in raiseException",
" ",
" File \"twisted\\internet\\defer.py\", line 824, in adapt",
" ",
" File \"lbrynet\\extras\\daemon\\Daemon.py\", line 93, in maybe_paginate",
" ",
" File \"lbrynet\\extras\\wallet\\manager.py\", line 339, in get_history",
" ",
"builtins.AssertionError: Invalid claim update state, expected to find previous claim in input."
],
"message": "Invalid claim update state, expected to find previous claim in input."
}
```
## System Configuration
<!-- For the app, this info is in the About section at the bottom of the Help page.
You can include a screenshot instead of typing it out -->
<!-- For the daemon, run:
curl 'http://localhost:5279' --data '{"method":"version"}'
and include the full output -->
- LBRY Daemon version:
- LBRY App version:
- LBRY Installation ID:
- Operating system:
## Anything Else
<!-- Include anything else that does not fit into the above sections -->
## Screenshots
<!-- If a screenshot would help explain the bug, please include one or two here -->
## Internal Use
### Acceptance Criteria
1.
2.
3.
### Definition of Done
- [ ] Tested against acceptance criteria
- [ ] Tested against the assumptions of user story
- [ ] The project builds without errors
- [ ] Unit tests are written and passing
- [ ] Tests on devices/browsers listed in the issue have passed
- [ ] QA performed & issues resolved
- [ ] Refactoring completed
- [ ] Any configuration or build changes documented
- [ ] Documentation updated
- [ ] Peer Code Review performed
|
non_process
|
tx list error invalid claim update state expected to find previous claim in input thanks for reporting an issue to lbry and helping us improve to make it possible for us to help you please fill out below information carefully before reporting any issues please make sure that you re using the latest version app daemon we are also available on discord at the issue on after transaction list changes re sync does not help c users thoma desktop latestdaemomn lbrynet transaction list code data file twisted internet defer py line in runcallbacks file lbrynet extras daemon auth server py line in trap file twisted python failure py line in trap file twisted python failure py line in raiseexception file twisted internet defer py line in adapt file lbrynet extras daemon daemon py line in maybe paginate file lbrynet extras wallet manager py line in get history builtins assertionerror invalid claim update state expected to find previous claim in input message invalid claim update state expected to find previous claim in input system configuration for the app this info is in the about section at the bottom of the help page you can include a screenshot instead of typing it out for the daemon run curl data method version and include the full output lbry daemon version lbry app version lbry installation id operating system anything else screenshots internal use acceptance criteria definition of done tested against acceptance criteria tested against the assumptions of user story the project builds without errors unit tests are written and passing tests on devices browsers listed in the issue have passed qa performed issues resolved refactoring completed any configuration or build changes documented documentation updated peer code review performed
| 0
|
14,759
| 18,041,264,077
|
IssuesEvent
|
2021-09-18 04:33:11
|
ooi-data/CE04OSPD-DP01B-06-DOSTAD105-recovered_inst-dpc_optode_instrument_recovered
|
https://api.github.com/repos/ooi-data/CE04OSPD-DP01B-06-DOSTAD105-recovered_inst-dpc_optode_instrument_recovered
|
opened
|
🛑 Processing failed: ResponseParserError
|
process
|
## Overview
`ResponseParserError` found in the `processing_task` task during a run that ended on 2021-09-18T04:33:10.999228.
## Details
Flow name: `CE04OSPD-DP01B-06-DOSTAD105-recovered_inst-dpc_optode_instrument_recovered`
Task name: `processing_task`
Error type: `ResponseParserError`
Error message: Unable to parse response (no element found: line 2, column 0), invalid XML received. Further retries may succeed:
b'<?xml version="1.0" encoding="UTF-8"?>\n'
<details>
<summary>Traceback</summary>
```
Traceback (most recent call last):
File "/srv/conda/envs/notebook/lib/python3.8/site-packages/botocore/parsers.py", line 452, in _parse_xml_string_to_dom
root = parser.close()
xml.etree.ElementTree.ParseError: no element found: line 2, column 0
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/srv/conda/envs/notebook/lib/python3.8/site-packages/ooi_harvester/processor/pipeline.py", line 101, in processing
final_path = finalize_zarr(
File "/srv/conda/envs/notebook/lib/python3.8/site-packages/ooi_harvester/processor/__init__.py", line 359, in finalize_zarr
source_store.fs.delete(source_store.root, recursive=True)
File "/srv/conda/envs/notebook/lib/python3.8/site-packages/fsspec/spec.py", line 1187, in delete
return self.rm(path, recursive=recursive, maxdepth=maxdepth)
File "/srv/conda/envs/notebook/lib/python3.8/site-packages/fsspec/asyn.py", line 88, in wrapper
return sync(self.loop, func, *args, **kwargs)
File "/srv/conda/envs/notebook/lib/python3.8/site-packages/fsspec/asyn.py", line 69, in sync
raise result[0]
File "/srv/conda/envs/notebook/lib/python3.8/site-packages/fsspec/asyn.py", line 25, in _runner
result[0] = await coro
File "/srv/conda/envs/notebook/lib/python3.8/site-packages/s3fs/core.py", line 1677, in _rm
await asyncio.gather(
File "/srv/conda/envs/notebook/lib/python3.8/site-packages/s3fs/core.py", line 1657, in _bulk_delete
await self._call_s3("delete_objects", kwargs, Bucket=bucket, Delete=delete_keys)
File "/srv/conda/envs/notebook/lib/python3.8/site-packages/s3fs/core.py", line 268, in _call_s3
raise err
File "/srv/conda/envs/notebook/lib/python3.8/site-packages/s3fs/core.py", line 248, in _call_s3
out = await method(**additional_kwargs)
File "/srv/conda/envs/notebook/lib/python3.8/site-packages/aiobotocore/client.py", line 141, in _make_api_call
http, parsed_response = await self._make_request(
File "/srv/conda/envs/notebook/lib/python3.8/site-packages/aiobotocore/client.py", line 161, in _make_request
return await self._endpoint.make_request(operation_model, request_dict)
File "/srv/conda/envs/notebook/lib/python3.8/site-packages/aiobotocore/endpoint.py", line 93, in _send_request
success_response, exception = await self._get_response(
File "/srv/conda/envs/notebook/lib/python3.8/site-packages/aiobotocore/endpoint.py", line 112, in _get_response
success_response, exception = await self._do_get_response(
File "/srv/conda/envs/notebook/lib/python3.8/site-packages/aiobotocore/endpoint.py", line 177, in _do_get_response
parsed_response = parser.parse(
File "/srv/conda/envs/notebook/lib/python3.8/site-packages/botocore/parsers.py", line 245, in parse
parsed = self._do_parse(response, shape)
File "/srv/conda/envs/notebook/lib/python3.8/site-packages/botocore/parsers.py", line 813, in _do_parse
self._add_modeled_parse(response, shape, final_parsed)
File "/srv/conda/envs/notebook/lib/python3.8/site-packages/botocore/parsers.py", line 822, in _add_modeled_parse
self._parse_payload(response, shape, member_shapes, final_parsed)
File "/srv/conda/envs/notebook/lib/python3.8/site-packages/botocore/parsers.py", line 862, in _parse_payload
original_parsed = self._initial_body_parse(response['body'])
File "/srv/conda/envs/notebook/lib/python3.8/site-packages/botocore/parsers.py", line 948, in _initial_body_parse
return self._parse_xml_string_to_dom(xml_string)
File "/srv/conda/envs/notebook/lib/python3.8/site-packages/botocore/parsers.py", line 454, in _parse_xml_string_to_dom
raise ResponseParserError(
botocore.parsers.ResponseParserError: Unable to parse response (no element found: line 2, column 0), invalid XML received. Further retries may succeed:
b'<?xml version="1.0" encoding="UTF-8"?>\n'
```
</details>
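Since the parser error message itself says further retries may succeed, a pragmatic mitigation is to retry the failing delete with backoff before giving up. A minimal sketch; `fs` stands for the same s3fs filesystem used by `finalize_zarr` above, and the helper name is illustrative:
```python
# Sketch: retry an s3fs delete when S3 returns a truncated XML response
# that botocore cannot parse ("Further retries may succeed").
import time
from botocore.parsers import ResponseParserError

def delete_with_retries(fs, path, attempts=3, delay=2.0):
    for attempt in range(1, attempts + 1):
        try:
            fs.delete(path, recursive=True)
            return
        except ResponseParserError:
            if attempt == attempts:
                raise                     # give up after the last attempt
            time.sleep(delay * attempt)   # simple linear backoff
```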
|
1.0
|
🛑 Processing failed: ResponseParserError - ## Overview
`ResponseParserError` found in the `processing_task` task during a run that ended on 2021-09-18T04:33:10.999228.
## Details
Flow name: `CE04OSPD-DP01B-06-DOSTAD105-recovered_inst-dpc_optode_instrument_recovered`
Task name: `processing_task`
Error type: `ResponseParserError`
Error message: Unable to parse response (no element found: line 2, column 0), invalid XML received. Further retries may succeed:
b'<?xml version="1.0" encoding="UTF-8"?>\n'
<details>
<summary>Traceback</summary>
```
Traceback (most recent call last):
File "/srv/conda/envs/notebook/lib/python3.8/site-packages/botocore/parsers.py", line 452, in _parse_xml_string_to_dom
root = parser.close()
xml.etree.ElementTree.ParseError: no element found: line 2, column 0
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/srv/conda/envs/notebook/lib/python3.8/site-packages/ooi_harvester/processor/pipeline.py", line 101, in processing
final_path = finalize_zarr(
File "/srv/conda/envs/notebook/lib/python3.8/site-packages/ooi_harvester/processor/__init__.py", line 359, in finalize_zarr
source_store.fs.delete(source_store.root, recursive=True)
File "/srv/conda/envs/notebook/lib/python3.8/site-packages/fsspec/spec.py", line 1187, in delete
return self.rm(path, recursive=recursive, maxdepth=maxdepth)
File "/srv/conda/envs/notebook/lib/python3.8/site-packages/fsspec/asyn.py", line 88, in wrapper
return sync(self.loop, func, *args, **kwargs)
File "/srv/conda/envs/notebook/lib/python3.8/site-packages/fsspec/asyn.py", line 69, in sync
raise result[0]
File "/srv/conda/envs/notebook/lib/python3.8/site-packages/fsspec/asyn.py", line 25, in _runner
result[0] = await coro
File "/srv/conda/envs/notebook/lib/python3.8/site-packages/s3fs/core.py", line 1677, in _rm
await asyncio.gather(
File "/srv/conda/envs/notebook/lib/python3.8/site-packages/s3fs/core.py", line 1657, in _bulk_delete
await self._call_s3("delete_objects", kwargs, Bucket=bucket, Delete=delete_keys)
File "/srv/conda/envs/notebook/lib/python3.8/site-packages/s3fs/core.py", line 268, in _call_s3
raise err
File "/srv/conda/envs/notebook/lib/python3.8/site-packages/s3fs/core.py", line 248, in _call_s3
out = await method(**additional_kwargs)
File "/srv/conda/envs/notebook/lib/python3.8/site-packages/aiobotocore/client.py", line 141, in _make_api_call
http, parsed_response = await self._make_request(
File "/srv/conda/envs/notebook/lib/python3.8/site-packages/aiobotocore/client.py", line 161, in _make_request
return await self._endpoint.make_request(operation_model, request_dict)
File "/srv/conda/envs/notebook/lib/python3.8/site-packages/aiobotocore/endpoint.py", line 93, in _send_request
success_response, exception = await self._get_response(
File "/srv/conda/envs/notebook/lib/python3.8/site-packages/aiobotocore/endpoint.py", line 112, in _get_response
success_response, exception = await self._do_get_response(
File "/srv/conda/envs/notebook/lib/python3.8/site-packages/aiobotocore/endpoint.py", line 177, in _do_get_response
parsed_response = parser.parse(
File "/srv/conda/envs/notebook/lib/python3.8/site-packages/botocore/parsers.py", line 245, in parse
parsed = self._do_parse(response, shape)
File "/srv/conda/envs/notebook/lib/python3.8/site-packages/botocore/parsers.py", line 813, in _do_parse
self._add_modeled_parse(response, shape, final_parsed)
File "/srv/conda/envs/notebook/lib/python3.8/site-packages/botocore/parsers.py", line 822, in _add_modeled_parse
self._parse_payload(response, shape, member_shapes, final_parsed)
File "/srv/conda/envs/notebook/lib/python3.8/site-packages/botocore/parsers.py", line 862, in _parse_payload
original_parsed = self._initial_body_parse(response['body'])
File "/srv/conda/envs/notebook/lib/python3.8/site-packages/botocore/parsers.py", line 948, in _initial_body_parse
return self._parse_xml_string_to_dom(xml_string)
File "/srv/conda/envs/notebook/lib/python3.8/site-packages/botocore/parsers.py", line 454, in _parse_xml_string_to_dom
raise ResponseParserError(
botocore.parsers.ResponseParserError: Unable to parse response (no element found: line 2, column 0), invalid XML received. Further retries may succeed:
b'<?xml version="1.0" encoding="UTF-8"?>\n'
```
</details>
|
process
|
🛑 processing failed responseparsererror overview responseparsererror found in processing task task during run ended on details flow name recovered inst dpc optode instrument recovered task name processing task error type responseparsererror error message unable to parse response no element found line column invalid xml received further retries may succeed b n traceback traceback most recent call last file srv conda envs notebook lib site packages botocore parsers py line in parse xml string to dom root parser close xml etree elementtree parseerror no element found line column during handling of the above exception another exception occurred traceback most recent call last file srv conda envs notebook lib site packages ooi harvester processor pipeline py line in processing final path finalize zarr file srv conda envs notebook lib site packages ooi harvester processor init py line in finalize zarr source store fs delete source store root recursive true file srv conda envs notebook lib site packages fsspec spec py line in delete return self rm path recursive recursive maxdepth maxdepth file srv conda envs notebook lib site packages fsspec asyn py line in wrapper return sync self loop func args kwargs file srv conda envs notebook lib site packages fsspec asyn py line in sync raise result file srv conda envs notebook lib site packages fsspec asyn py line in runner result await coro file srv conda envs notebook lib site packages core py line in rm await asyncio gather file srv conda envs notebook lib site packages core py line in bulk delete await self call delete objects kwargs bucket bucket delete delete keys file srv conda envs notebook lib site packages core py line in call raise err file srv conda envs notebook lib site packages core py line in call out await method additional kwargs file srv conda envs notebook lib site packages aiobotocore client py line in make api call http parsed response await self make request file srv conda envs notebook lib site packages aiobotocore client py line in make request return await self endpoint make request operation model request dict file srv conda envs notebook lib site packages aiobotocore endpoint py line in send request success response exception await self get response file srv conda envs notebook lib site packages aiobotocore endpoint py line in get response success response exception await self do get response file srv conda envs notebook lib site packages aiobotocore endpoint py line in do get response parsed response parser parse file srv conda envs notebook lib site packages botocore parsers py line in parse parsed self do parse response shape file srv conda envs notebook lib site packages botocore parsers py line in do parse self add modeled parse response shape final parsed file srv conda envs notebook lib site packages botocore parsers py line in add modeled parse self parse payload response shape member shapes final parsed file srv conda envs notebook lib site packages botocore parsers py line in parse payload original parsed self initial body parse response file srv conda envs notebook lib site packages botocore parsers py line in initial body parse return self parse xml string to dom xml string file srv conda envs notebook lib site packages botocore parsers py line in parse xml string to dom raise responseparsererror botocore parsers responseparsererror unable to parse response no element found line column invalid xml received further retries may succeed b n
| 1
|
18,303
| 24,416,212,849
|
IssuesEvent
|
2022-10-05 16:06:00
|
UserOfficeProject/user-office-project-issue-tracker
|
https://api.github.com/repos/UserOfficeProject/user-office-project-issue-tracker
|
closed
|
Plan support for ISIS Direct deadline
|
type: process area: uop/stfc
|
**External** Closes on the 19th October
**Internal** Closes on the 24th October
We should have a person dedicated to 2nd line support for proposal submission, on top of our normal 2nd line support, on weekdays between 12th and 25th October.
I'm away 10th-21st October, so I need to make sure other team and group leaders are around to support with any comms and crisis management - "1st and 2nd line escalation".
In addition to normal support, we should provide inbox and service monitoring over the weekend before the proposal deadline - "1st line". This won't be standard practice in the future.
### Support rota
| Oct date <br> Weekday | 12<br>W | 13<br>T | 14<br>F | _15<br>S_ | _16<br>S_ | 17<br>M | 18<br>T | 19<br>W | 20<br>T | 21<br>F | _22<br>S_ | _23<br>S_ | 24<br>M | 25<br>T
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | ---
| 1st line | GF | GF | GF | EH | EH | EH | EH | EH | GF | GF | _N/A_ | _N/A_ | GF | GF
| Dedicated proposal | SF/RK | SF/RK | SF/RK | _N/A_ | _N/A_ | SF/RK | SF/RK | SF/RK | SF/RK | SF/RK | _N/A_ | _N/A_ | SF/RK | SF/RK
| 1st level escalation | GO | GO | GO | _N/A_ | _N/A_ | GO | GO | GO | GO | GO | _N/A_ | _N/A_ | PR | PR
| 2nd level escalation | SC | SC | SC | _N/A_ | _N/A_ | SC | SC | SH | SH | SH | _N/A_ | _N/A_ | SH | SH
GF: George Flecknell
EH: Edward Haynes
SF: Simon Fernandes
RK: Rasmia Kulan
GO: Gbenga Omirinde
SC: Sonia Conway
SH: Simon Hodder
### Out of hours support responsibilities
- Handle 1st line issues and any 2nd line issues where possible
- Put up messages if there's a significant outage:
- Across all apps
- On the old proposal submission homepage
- On the new proposal submission dashboard (if possible/applicable)
|
1.0
|
Plan support for ISIS Direct deadline - **External** Closes on the 19th October
**Internal** Closes on the 24th October
We should have a person dedicated to 2nd line support for proposal submission, on top of our normal 2nd line support, on weekdays between 12th and 25th October.
I'm away 10th-21st October, so I need to make sure other team and group leaders are around to support with any comms and crisis management - "1st and 2nd line escalation".
In addition to normal support, we should provide inbox and service monitoring over the weekend before the proposal deadline - "1st line". This won't be standard practice in the future.
### Support rota
| Oct date <br> Weekday | 12<br>W | 13<br>T | 14<br>F | _15<br>S_ | _16<br>S_ | 17<br>M | 18<br>T | 19<br>W | 20<br>T | 21<br>F | _22<br>S_ | _23<br>S_ | 24<br>M | 25<br>T
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | ---
| 1st line | GF | GF | GF | EH | EH | EH | EH | EH | GF | GF | _N/A_ | _N/A_ | GF | GF
| Dedicated proposal | SF/RK | SF/RK | SF/RK | _N/A_ | _N/A_ | SF/RK | SF/RK | SF/RK | SF/RK | SF/RK | _N/A_ | _N/A_ | SF/RK | SF/RK
| 1st level escalation | GO | GO | GO | _N/A_ | _N/A_ | GO | GO | GO | GO | GO | _N/A_ | _N/A_ | PR | PR
| 2nd level escalation | SC | SC | SC | _N/A_ | _N/A_ | SC | SC | SH | SH | SH | _N/A_ | _N/A_ | SH | SH
GF: George Flecknell
EH: Edward Haynes
SF: Simon Fernandes
RK: Rasmia Kulan
GO: Gbenga Omirinde
SC: Sonia Conway
SH: Simon Hodder
### Out of hours support responsibilities
- Handle 1st line issues and any 2nd line issues where possible
- Put up messages if there's a significant outage:
- Across all apps
- On the old proposal submission homepage
- On the new proposal submission dashboard (if possible/applicable)
|
process
|
plan support for isis direct deadline external closes on the october internal closes on the october we should have a person dedicated to line support on top of our normal line support dedicated to proposal submission on weekdays between and october i m away october so i need to make sure other team and group leaders are around to support with any comms and crisis management and line escalation an addition to normal support we should provide inbox and service monitoring over the weekend before the proposal deadline line this won t be standard practice in the future support rota oct date weekday w t f s s m t w t f s s m t line gf gf gf eh eh eh eh eh gf gf n a n a gf gf dedicated proposal sf rk sf rk sf rk n a n a sf rk sf rk sf rk sf rk sf rk n a n a sf rk sf rk level escalation go go go n a n a go go go go go n a n a pr pr level escalation sc sc sc n a n a sc sc sh sh sh n a n a sh sh gf george flecknell eh edward haynes sf simon fernandes rk rasmia kulan go gbenga omirinde sc sonia conway sh simon hodder out of hours support responsibilities handle line issues and any line issues where possible put up messages if there s a significant outage across all apps on the old proposal submission homepage on the new proposal submission dashboard if possible applicable
| 1
|
20,268
| 26,894,540,850
|
IssuesEvent
|
2023-02-06 11:23:15
|
firebase/firebase-cpp-sdk
|
https://api.github.com/repos/firebase/firebase-cpp-sdk
|
reopened
|
[C++] Nightly Integration Testing Report
|
type: process nightly-testing
|
Note: This report excludes firestore. Please also check **[the report for firestore](https://github.com/firebase/firebase-cpp-sdk/issues/1178)**
***
<hidden value="integration-test-status-comment"></hidden>
### ✅ [build against repo] Integration test succeeded!
Requested by @DellaBitta on commit 5c0eebe6cdffa6007bd82cfac606c515ef9abb94
Last updated: Sun Feb 5 02:38 PST 2023
**[View integration test log & download artifacts](https://github.com/firebase/firebase-cpp-sdk/actions/runs/4095824601)**
<hidden value="integration-test-status-comment"></hidden>
***
### ✅ [build against SDK] Integration test succeeded!
Requested by @firebase-workflow-trigger[bot] on commit 5c0eebe6cdffa6007bd82cfac606c515ef9abb94
Last updated: Sun Feb 5 05:19 PST 2023
**[View integration test log & download artifacts](https://github.com/firebase/firebase-cpp-sdk/actions/runs/4096493964)**
<hidden value="integration-test-status-comment"></hidden>
|
1.0
|
[C++] Nightly Integration Testing Report - Note: This report excludes firestore. Please also check **[the report for firestore](https://github.com/firebase/firebase-cpp-sdk/issues/1178)**
***
<hidden value="integration-test-status-comment"></hidden>
### ✅ [build against repo] Integration test succeeded!
Requested by @DellaBitta on commit 5c0eebe6cdffa6007bd82cfac606c515ef9abb94
Last updated: Sun Feb 5 02:38 PST 2023
**[View integration test log & download artifacts](https://github.com/firebase/firebase-cpp-sdk/actions/runs/4095824601)**
<hidden value="integration-test-status-comment"></hidden>
***
### ✅ [build against SDK] Integration test succeeded!
Requested by @firebase-workflow-trigger[bot] on commit 5c0eebe6cdffa6007bd82cfac606c515ef9abb94
Last updated: Sun Feb 5 05:19 PST 2023
**[View integration test log & download artifacts](https://github.com/firebase/firebase-cpp-sdk/actions/runs/4096493964)**
<hidden value="integration-test-status-comment"></hidden>
|
process
|
nightly integration testing report note this report excludes firestore please also check ✅ nbsp integration test succeeded requested by dellabitta on commit last updated sun feb pst ✅ nbsp integration test succeeded requested by firebase workflow trigger on commit last updated sun feb pst
| 1
|
532,032
| 15,528,880,545
|
IssuesEvent
|
2021-03-13 12:57:09
|
zephyrproject-rtos/zephyr
|
https://api.github.com/repos/zephyrproject-rtos/zephyr
|
closed
|
[Coverity CID :219509] Side effect in assertion in tests/net/socket/tcp/src/main.c
|
Coverity bug priority: low
|
Static code scan issues found in file:
https://github.com/zephyrproject-rtos/zephyr/tree/bd97359a5338b2542d19011b6d6aa1d8d1b9cc3f/tests/net/socket/tcp/src/main.c
Category: Incorrect expression
Function: `test_socket_permission`
Component: Tests
CID: [219509](https://scan9.coverity.com/reports.htm#v29726/p12996/mergedDefectId=219509)
Details:
https://github.com/zephyrproject-rtos/zephyr/blob/bd97359a5338b2542d19011b6d6aa1d8d1b9cc3f/tests/net/socket/tcp/src/main.c#L753
Please fix or provide comments in coverity using the link:
https://scan9.coverity.com/reports.htm#v32951/p12996.
Note: This issue was created automatically. Priority was set based on classification
of the file affected and the impact field in coverity. Assignees were set using the CODEOWNERS file.
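For context on the defect class: an assertion must not change program state, because assertion code may be compiled out (C's `assert` under `NDEBUG` is the classic case). The flagged code is C, but the same pitfall is easy to show in Python, where `python -O` strips `assert` statements:
```python
# Sketch of the "side effect in assertion" pitfall, in Python for
# illustration only (the flagged Zephyr code is C).
items = [1, 2, 3]

# BAD: pops an element as a side effect of the assertion; under -O the
# pop never happens and program state silently diverges.
assert items.pop() == 3

# GOOD: perform the side effect first, then assert on the result.
last = items.pop()
assert last == 2
```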
|
1.0
|
[Coverity CID :219509] Side effect in assertion in tests/net/socket/tcp/src/main.c -
Static code scan issues found in file:
https://github.com/zephyrproject-rtos/zephyr/tree/bd97359a5338b2542d19011b6d6aa1d8d1b9cc3f/tests/net/socket/tcp/src/main.c
Category: Incorrect expression
Function: `test_socket_permission`
Component: Tests
CID: [219509](https://scan9.coverity.com/reports.htm#v29726/p12996/mergedDefectId=219509)
Details:
https://github.com/zephyrproject-rtos/zephyr/blob/bd97359a5338b2542d19011b6d6aa1d8d1b9cc3f/tests/net/socket/tcp/src/main.c#L753
Please fix or provide comments in coverity using the link:
https://scan9.coverity.com/reports.htm#v32951/p12996.
Note: This issue was created automatically. Priority was set based on classification
of the file affected and the impact field in coverity. Assignees were set using the CODEOWNERS file.
|
non_process
|
side effect in assertion in tests net socket tcp src main c static code scan issues found in file category incorrect expression function test socket permission component tests cid details please fix or provide comments in coverity using the link note this issue was created automatically priority was set based on classification of the file affected and the impact field in coverity assignees were set using the codeowners file
| 0
|
168,806
| 26,701,278,780
|
IssuesEvent
|
2023-01-27 14:35:19
|
microsoft/fluentui
|
https://api.github.com/repos/microsoft/fluentui
|
closed
|
[Bug]: Pressing Tab key closes the picker (both floating and base picker)
|
Resolution: By Design Fluent UI react (v8)
|
### Library
React / v8 (@fluentui/react)
### System Info
```shell
System:
OS: Windows 10 10.0.19044
CPU: (12) x64 Intel(R) Xeon(R) W-2133 CPU @ 3.60GHz
Memory: 34.70 GB / 63.66 GB
Browsers:
Chrome: 106.0.5249.119
Edge: Spartan (44.19041.1266.0), Chromium (106.0.1370.47)
Internet Explorer: 11.0.19041.1566
```
### Are you reporting Accessibility issue?
yes
### Reproduction
https://developer.microsoft.com/en-us/fluentui#/controls/web/pickers
### Bug Description
This is especially important if we have put interactive elements in the footer of the picker using the renderFooter callback.
## Actual Behavior
When we press the Tab key, it closes the picker.
## Expected Behavior
It should put focus on the next available focusable element.
## WorkAround
For our use case we show Create Topic / Give Feedback buttons in the footer; however, because the Tab key event functions the same way as the Enter key, the footer buttons are inaccessible from the keyboard. Hence we copied the onKeyDown event from the BaseFloatingPicker class and removed the Tab key case from the switch.
### Logs
_No response_
### Requested priority
Normal
### Products/sites affected
Topic Picker
### Are you willing to submit a PR to fix?
yes
### Validations
- [X] Check that there isn't already an issue that reports the same bug to avoid creating a duplicate.
- [X] The provided reproduction is a minimal reproducible example of the bug.
|
1.0
|
[Bug]: Pressing Tab key closes the picker (both floating and base picker) - ### Library
React / v8 (@fluentui/react)
### System Info
```shell
System:
OS: Windows 10 10.0.19044
CPU: (12) x64 Intel(R) Xeon(R) W-2133 CPU @ 3.60GHz
Memory: 34.70 GB / 63.66 GB
Browsers:
Chrome: 106.0.5249.119
Edge: Spartan (44.19041.1266.0), Chromium (106.0.1370.47)
Internet Explorer: 11.0.19041.1566
```
### Are you reporting Accessibility issue?
yes
### Reproduction
https://developer.microsoft.com/en-us/fluentui#/controls/web/pickers
### Bug Description
This is especially important if we have put interactive elements in the footer of the picker using the renderFooter callback.
## Actual Behavior
When we press the TAB key, it closes the picker
## Expected Behavior
It should put focus on the next available focusable element
## WorkAround
For our use case we show create-topic / give-feedback buttons in the footer; however, because the Tab key event
functions the same as the Enter key, the footer buttons are inaccessible from the keyboard. Hence we copied
the onKeyDown handler from the BaseFloatingPicker class and removed the Tab key case from the switch.
### Logs
_No response_
### Requested priority
Normal
### Products/sites affected
Topic Picker
### Are you willing to submit a PR to fix?
yes
### Validations
- [X] Check that there isn't already an issue that reports the same bug to avoid creating a duplicate.
- [X] The provided reproduction is a minimal reproducible example of the bug.
|
non_process
|
pressing tab key closes the picker both floating and base picker library react fluentui react system info shell system os windows cpu intel r xeon r w cpu memory gb gb browsers chrome edge spartan chromium internet explorer are you reporting accessibility issue yes reproduction bug description this is specially important if we have put interactive elements in footer of picker using renderfooter callback actual behavior when we press tab key it closes the picker expected behavior it should put focus on the next available focusable element workaround for our use case we show create topic give feedback buttons as footer however due to tab key event functioning same as the enter key it makes the footer buttons inaccessible from keyboard hence copied the onkeydown event from the basefloatingpicker class and removed the tab key case from the switch logs no response requested priority normal products sites affected topic picker are you willing to submit a pr to fix yes validations check that there isn t already an issue that reports the same bug to avoid creating a duplicate the provided reproduction is a minimal reproducible example of the bug
| 0
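The workaround described in the record above boils down to keeping Tab away from the picker's own keydown switch. A minimal sketch of that idea with a capture-phase handler (the wrapper name is made up, and the real BaseFloatingPicker handler differs):

```tsx
import * as React from 'react';

// Swallow Tab before the picker's own onKeyDown switch sees it, so the
// browser moves focus to the footer buttons instead of closing the callout.
export const TabSafePickerWrapper: React.FC<{ children: React.ReactNode }> = ({
  children,
}) => (
  <div
    onKeyDownCapture={(ev: React.KeyboardEvent<HTMLDivElement>) => {
      if (ev.key === 'Tab') {
        ev.stopPropagation(); // default focus traversal still happens
      }
    }}
  >
    {children}
  </div>
);
```

Capturing at an ancestor lets the browser's default focus traversal proceed while the picker's bubble-phase handler never sees the Tab key.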
|
10,012
| 13,043,880,567
|
IssuesEvent
|
2020-07-29 02:56:13
|
tikv/tikv
|
https://api.github.com/repos/tikv/tikv
|
closed
|
UCP: Migrate scalar function `LastDay` from TiDB
|
challenge-program-2 component/coprocessor difficulty/easy sig/coprocessor
|
## Description
Port the scalar function `LastDay` from TiDB to coprocessor.
## Score
* 50
## Mentor(s)
* @lonng
## Recommended Skills
* Rust programming
## Learning Materials
Already implemented expressions ported from TiDB
- https://github.com/tikv/tikv/tree/master/components/tidb_query/src/rpn_expr
- https://github.com/tikv/tikv/tree/master/components/tidb_query/src/expr
|
2.0
|
UCP: Migrate scalar function `LastDay` from TiDB -
## Description
Port the scalar function `LastDay` from TiDB to coprocessor.
## Score
* 50
## Mentor(s)
* @lonng
## Recommended Skills
* Rust programming
## Learning Materials
Already implemented expressions ported from TiDB
- https://github.com/tikv/tikv/tree/master/components/tidb_query/src/rpn_expr
- https://github.com/tikv/tikv/tree/master/components/tidb_query/src/expr
|
process
|
ucp migrate scalar function lastday from tidb description port the scalar function lastday from tidb to coprocessor score mentor s lonng recommended skills rust programming learning materials already implemented expressions ported from tidb
| 1
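For orientation, the calendar math at the heart of `LAST_DAY` is small; the real port additionally has to handle NULLs, zero/invalid dates, and the coprocessor evaluation plumbing in the linked `rpn_expr` modules. A hedged sketch of just the core logic:

```rust
/// Gregorian leap-year rule.
fn is_leap_year(y: u32) -> bool {
    (y % 4 == 0 && y % 100 != 0) || y % 400 == 0
}

/// Day number of the last day of the given month, or None if the month
/// is invalid (the SQL function would return NULL in that case).
fn last_day_of_month(year: u32, month: u32) -> Option<u32> {
    match month {
        1 | 3 | 5 | 7 | 8 | 10 | 12 => Some(31),
        4 | 6 | 9 | 11 => Some(30),
        2 => Some(if is_leap_year(year) { 29 } else { 28 }),
        _ => None,
    }
}

fn main() {
    assert_eq!(last_day_of_month(2020, 2), Some(29));
    assert_eq!(last_day_of_month(2019, 2), Some(28));
    assert_eq!(last_day_of_month(2019, 4), Some(30));
    assert_eq!(last_day_of_month(2019, 13), None);
}
```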
|
22,597
| 31,818,785,249
|
IssuesEvent
|
2023-09-13 23:18:17
|
h4sh5/npm-auto-scanner
|
https://api.github.com/repos/h4sh5/npm-auto-scanner
|
opened
|
@truffle/dashboard 0.4.5 has 2 guarddog issues
|
npm-install-script npm-silent-process-execution
|
```{"npm-install-script":[{"code":" \"prepare\": \"yarn build\",","location":"package/package.json:27","message":"The package.json has a script automatically running when the package is installed"}],"npm-silent-process-execution":[{"code":" const child = (0, child_process_1.spawn)(\"node\", [dashboardPath, optionsBase64], {\n detached: true,\n stdio: \"ignore\"\n });","location":"package/dist/lib/utils.js:13","message":"This package is silently executing another executable"}]}```
|
1.0
|
@truffle/dashboard 0.4.5 has 2 guarddog issues - ```{"npm-install-script":[{"code":" \"prepare\": \"yarn build\",","location":"package/package.json:27","message":"The package.json has a script automatically running when the package is installed"}],"npm-silent-process-execution":[{"code":" const child = (0, child_process_1.spawn)(\"node\", [dashboardPath, optionsBase64], {\n detached: true,\n stdio: \"ignore\"\n });","location":"package/dist/lib/utils.js:13","message":"This package is silently executing another executable"}]}```
|
process
|
truffle dashboard has guarddog issues npm install script npm silent process execution n detached true n stdio ignore n location package dist lib utils js message this package is silently executing another executable
| 1
|
9,508
| 12,495,500,638
|
IssuesEvent
|
2020-06-01 13:18:47
|
hashicorp/packer
|
https://api.github.com/repos/hashicorp/packer
|
closed
|
How do the 'vsphere' and 'vsphere-template' post-processors work?
|
post-processor/vsphere post-processor/vsphere-template question waiting-reply
|
Hello,
I created a template for packer to create a VM and install Fedora CoreOS on it. Once the VM is prepared, I run some post-processors that include a script to add a custom template to the VM and convert the OVF files to an OVA.
However, before that happens, I want to run the vsphere and vsphere-template processors so that I can have a copy in my vCenter.
My current config for this is:
```
"post-processors": [
{
"type": "shell-local",
"scripts": [
"files/scripts/xml-ip-ovf.sh"
]
},
[
{
"type": "vsphere",
"host": "{{user `vServer`}}",
"username": "{{user `vUser`}}",
"password": "{{user `vPass`}}",
"insecure": "true",
"datacenter": "{{user `vDatacenter`}}",
"cluster": "{{user `vCluster`}}",
"vm_name": "ASAA_vCenter_{{user `version`}}",
"vm_folder": "{{user `vFolder`}}",
"keep_input_artifact": true,
"overwrite": true
},
{
"type": "vsphere-template",
"host": "{{user `vServer`}}",
"username": "{{user `vUser`}}",
"password": "{{user `vPass`}}",
"insecure": "true",
"datacenter": "{{user `vDatacenter`}}",
"folder": "/{{user `vFolder`}}",
"keep_input_artifact": true
}
],
{
"type": "shell-local",
"scripts": [
"files/scripts/new-ovf-to-ova.sh"
]
}
]
```
When Post-processor kicks in, I get this error after packer completes
```
* Post-processor failed: Error uploading virtual machine: exit status 1
2020/05/29 23:52:35 machine readable: error-count []string{"1"}
==> Some builds didn't complete successfully and had errors:
2020/05/29 23:52:35 machine readable: vsphere-iso,error []string{"1 error(s) occurred:\n\n* Post-processor failed: Error uploading virtual machine: exit status 1\n\n"}
Build 'vsphere-iso' errored: 1 error(s) occurred:
* Post-processor failed: Error uploading virtual machine: exit status 1
==> Builds finished but no artifacts were created.
* Post-processor failed: Error uploading virtual machine: exit status 1
2020/05/29 23:52:35 [INFO] (telemetry) Finalizing.
==> Some builds didn't complete successfully and had errors:
--> vsphere-iso: 1 error(s) occurred:
* Post-processor failed: Error uploading virtual machine: exit status 1
```
I read the documentation and I don't see what I am doing wrong.
Is there a step I'm missing?
|
2.0
|
How do the 'vsphere' and 'vsphere-template' post-processors work? - Hello,
I created a template for packer to create a VM and install Fedora CoreOS on it. Once the VM is prepared, I run some post-processors that include a script to add a custom template to the VM and convert the OVF files to an OVA.
However, before that happens, I want to run the vsphere and vsphere-template processors so that I can have a copy in my vCenter.
My current config for this is:
```
"post-processors": [
{
"type": "shell-local",
"scripts": [
"files/scripts/xml-ip-ovf.sh"
]
},
[
{
"type": "vsphere",
"host": "{{user `vServer`}}",
"username": "{{user `vUser`}}",
"password": "{{user `vPass`}}",
"insecure": "true",
"datacenter": "{{user `vDatacenter`}}",
"cluster": "{{user `vCluster`}}",
"vm_name": "ASAA_vCenter_{{user `version`}}",
"vm_folder": "{{user `vFolder`}}",
"keep_input_artifact": true,
"overwrite": true
},
{
"type": "vsphere-template",
"host": "{{user `vServer`}}",
"username": "{{user `vUser`}}",
"password": "{{user `vPass`}}",
"insecure": "true",
"datacenter": "{{user `vDatacenter`}}",
"folder": "/{{user `vFolder`}}",
"keep_input_artifact": true
}
],
{
"type": "shell-local",
"scripts": [
"files/scripts/new-ovf-to-ova.sh"
]
}
]
```
When Post-processor kicks in, I get this error after packer completes
```
* Post-processor failed: Error uploading virtual machine: exit status 1
2020/05/29 23:52:35 machine readable: error-count []string{"1"}
==> Some builds didn't complete successfully and had errors:
2020/05/29 23:52:35 machine readable: vsphere-iso,error []string{"1 error(s) occurred:\n\n* Post-processor failed: Error uploading virtual machine: exit status 1\n\n"}
Build 'vsphere-iso' errored: 1 error(s) occurred:
* Post-processor failed: Error uploading virtual machine: exit status 1
==> Builds finished but no artifacts were created.
* Post-processor failed: Error uploading virtual machine: exit status 1
2020/05/29 23:52:35 [INFO] (telemetry) Finalizing.
==> Some builds didn't complete successfully and had errors:
--> vsphere-iso: 1 error(s) occurred:
* Post-processor failed: Error uploading virtual machine: exit status 1
```
I read the documentation and I don't see what I am doing wrong.
Is there a step I'm missing?
|
process
|
how does post processor vsphere and vsphere template work hello i created a template for packer to create a vm and install fedora coreos on it once the vm is prepared i run some post processors that includes a script to add a custom template to the vm and convert the off files to an ova however before that happens i want to run the vsphere and vsphere template processors so that i can have a copy in my vcenter my current config for this is post processors type shell local scripts files scripts xml ip ovf sh type vsphere host user vserver username user vuser password user vpass insecure true datacenter user vdatacenter cluster user vcluster vm name asaa vcenter user version vm folder user vfolder keep input artifact true overwrite true type vsphere template host user vserver username user vuser password user vpass insecure true datacenter user vdatacenter folder user vfolder keep input artifact true type shell local scripts files scripts new ovf to ova sh when post processor kicks in i get this error after packer completes post processor failed error uploading virtual machine exit status machine readable error count string some builds didn t complete successfully and had errors machine readable vsphere iso error string error s occurred n n post processor failed error uploading virtual machine exit status n n build vsphere iso errored error s occurred post processor failed error uploading virtual machine exit status builds finished but no artifacts were created post processor failed error uploading virtual machine exit status telemetry finalizing some builds didn t complete successfully and had errors vsphere iso error s occurred post processor failed error uploading virtual machine exit status i read the documentation and i don t see what i am doing wrong is there a step i m missing
| 1
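A debugging note on the error above: the vsphere post-processor shells out to VMware's ovftool, and "exit status 1" is ovftool's generic failure code with the detailed message swallowed. Running the equivalent upload by hand usually surfaces the real error. A sketch, where the OVF path and the vi:// locator are placeholders rather than known-good values for this setup:

```sh
# Hypothetical manual reproduction of the upload the post-processor performs.
ovftool --noSSLVerify --acceptAllEulas \
  output-vsphere-iso/ASAA_vCenter.ovf \
  "vi://USER:PASSWORD@VSERVER/DATACENTER/host/CLUSTER"
```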
|
689,584
| 23,625,744,467
|
IssuesEvent
|
2022-08-25 03:35:03
|
TencentBlueKing/bk-user
|
https://api.github.com/repos/TencentBlueKing/bk-user
|
opened
|
[API] Optimize limit/offset queries on large tables
|
Type: enhancement Priority: Middlum Layer: api todo
|
In the profiles_profile table, when paging through a full sync, the later pages produce slow queries:
```sql
SELECT ( IF(`domain` = 'default.local', username, Concat(username, '@', domain))
) AS
`username`,
`profiles_profile`.`id`,
`profiles_profile`.username,
`profiles_profile`.`qq`,
`profiles_profile`.`email`,
`profiles_profile`.`telephone`,
`profiles_profile`.`wx_userid`,
`profiles_profile`.`display_name`,
`profiles_profile`.`extras`
FROM `profiles_profile`
WHERE `profiles_profile`.`enabled` = 1
ORDER BY `profiles_profile`.`id` ASC
LIMIT 2000 offset 30000
```
|
1.0
|
[API] Optimize limit/offset queries on large tables - In the profiles_profile table, when paging through a full sync, the later pages produce slow queries:
```sql
SELECT ( IF(`domain` = 'default.local', username, Concat(username, '@', domain))
) AS
`username`,
`profiles_profile`.`id`,
`profiles_profile`.username,
`profiles_profile`.`qq`,
`profiles_profile`.`email`,
`profiles_profile`.`telephone`,
`profiles_profile`.`wx_userid`,
`profiles_profile`.`display_name`,
`profiles_profile`.`extras`
FROM `profiles_profile`
WHERE `profiles_profile`.`enabled` = 1
ORDER BY `profiles_profile`.`id` ASC
LIMIT 2000 offset 30000
```
|
non_process
|
optimize limit offset queries on large tables in the profiles profile table when paging through a full sync the later pages produce slow queries sql select if domain default local username concat username domain as username profiles profile id profiles profile username profiles profile qq profiles profile email profiles profile telephone profiles profile wx userid profiles profile display name profiles profile extras from profiles profile where profiles profile enabled order by profiles profile id asc limit offset
| 0
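The slowness described in the record above is the classic deep-offset problem: `LIMIT 2000 OFFSET 30000` makes the server walk and discard 30,000 rows before returning a page. The usual fix is keyset (seek) pagination on the ordered primary key; a sketch against the same table, where `:last_seen_id` is a placeholder bind parameter:

```sql
-- Keyset pagination: remember the last id of the previous page and seek
-- past it, so the skipped rows are never scanned.
SELECT `profiles_profile`.`id`,
       `profiles_profile`.`username`
FROM `profiles_profile`
WHERE `profiles_profile`.`enabled` = 1
  AND `profiles_profile`.`id` > :last_seen_id
ORDER BY `profiles_profile`.`id` ASC
LIMIT 2000;
```

Seeking on the primary key keeps every page an index range scan, so a deep page costs the same as the first one.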
|
13,289
| 15,765,915,968
|
IssuesEvent
|
2021-03-31 14:34:50
|
threefoldtech/js-sdk
|
https://api.github.com/repos/threefoldtech/js-sdk
|
closed
|
Where is my old cluster with Ubuntu 18.04 installed?
|
process_wontfix type_question
|
I bought 3bot masteritua (main net) until 30.03, but for some reason it expired on 23.03.
Then, I paid for 3bot (masteritua) until 6.04, and I got a new Admin panel, without my previously installed Ubuntu 18.04 solution
Question:
- Where is my old cluster with ip: 10.100.8.2?
|
1.0
|
Where is my old cluster with Ubuntu 18.04 installed? - I bought 3bot masteritua (main net) until 30.03, but for some reason it expired on 23.03.
Then, I paid for 3bot (masteritua) until 6.04, and I got a new Admin panel, without my previously installed Ubuntu 18.04 solution
Question:
- Where is my old cluster with ip: 10.100.8.2?
|
process
|
where is my old cluster installed ubuntu i bought masteritua main net to date expired for some reason then i paid masteritua to and i got a new admin panel without my installed solution ubuntu before question where is my old cluster with ip
| 1
|
18,914
| 3,098,645,812
|
IssuesEvent
|
2015-08-28 12:33:42
|
jokeane/bjspell
|
https://api.github.com/repos/jokeane/bjspell
|
closed
|
Dictionary file generation
|
auto-migrated Priority-Medium Type-Defect
|
```
Hello, I found this library and I would be interested in trying it with an
Italian dictionary, but it is not clear to me how to generate one from the
dict and affix files.
Simone
```
Original issue reported on code.google.com by `quattord...@gmail.com` on 15 Oct 2012 at 6:38
|
1.0
|
Dictionary file generation - ```
Hello, I found this library and I would be interested in trying it with an
Italian dictionary, but it is not clear to me how to generate one from the
dict and affix files.
Simone
```
Original issue reported on code.google.com by `quattord...@gmail.com` on 15 Oct 2012 at 6:38
|
non_process
|
dictionary file generation hello i found this library and i would be interested in trying it with an italian dictionary but it is not clear to me how to generate one from the dict and affix files simone original issue reported on code google com by quattord gmail com on oct at
| 0
|
198,100
| 6,969,927,658
|
IssuesEvent
|
2017-12-11 08:20:05
|
pravega/pravega
|
https://api.github.com/repos/pravega/pravega
|
opened
|
Error in StorageWriter
|
area/server kind/bug priority/P0 status/needs-attention version/0.2.0
|
**Problem description**
We have observed the following exception in the segment store while running some tests:
```
2017-12-07 18:35:51,305 80300529 [segment-store-3] ERROR i.p.s.server.writer.StorageWriter - StorageWriter[2]: Iteration[9376].Error.
java.lang.IllegalArgumentException: startOffset must refer to an offset beyond the Segment's StorageLength offset.
at com.google.common.base.Preconditions.checkArgument(Preconditions.java:122)
at io.pravega.segmentstore.server.reading.StreamSegmentReadIndex.readDirect(StreamSegmentReadIndex.java:581)
at io.pravega.segmentstore.server.reading.ContainerReadIndex.readDirect(ContainerReadIndex.java:176)
at io.pravega.segmentstore.server.writer.StorageWriterFactory$StorageWriterDataSource.getAppendData(StorageWriterFactory.java:123)
at io.pravega.segmentstore.server.writer.SegmentAggregator.getFlushArgs(SegmentAggregator.java:649)
at io.pravega.segmentstore.server.writer.SegmentAggregator.flushPendingAppends(SegmentAggregator.java:598)
at io.pravega.segmentstore.server.writer.SegmentAggregator.lambda$flushExcess$14(SegmentAggregator.java:579)
at io.pravega.common.concurrent.FutureHelpers$Loop.call(FutureHelpers.java:642)
at io.pravega.common.concurrent.FutureHelpers$Loop.call(FutureHelpers.java:611)
at io.pravega.common.concurrent.FutureHelpers.runOrFail(FutureHelpers.java:524)
at io.pravega.common.concurrent.FutureHelpers$Loop.run(FutureHelpers.java:655)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$201(ScheduledThreadPoolExecutor.java:180)
at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:293)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
```
This error was found against our 0.1 release, so it is possible that it is no longer an issue, but I couldn't find a related one. Unfortunately, we don't have any more information than this stack trace for now.
**Problem location**
Segment store.
**Suggestions for an improvement**
Determine whether it is an issue or not and fix in the case it is.
|
1.0
|
Error in StorageWriter - **Problem description**
We have observed the following exception in the segment store while running some tests:
```
2017-12-07 18:35:51,305 80300529 [segment-store-3] ERROR i.p.s.server.writer.StorageWriter - StorageWriter[2]: Iteration[9376].Error.
java.lang.IllegalArgumentException: startOffset must refer to an offset beyond the Segment's StorageLength offset.
at com.google.common.base.Preconditions.checkArgument(Preconditions.java:122)
at io.pravega.segmentstore.server.reading.StreamSegmentReadIndex.readDirect(StreamSegmentReadIndex.java:581)
at io.pravega.segmentstore.server.reading.ContainerReadIndex.readDirect(ContainerReadIndex.java:176)
at io.pravega.segmentstore.server.writer.StorageWriterFactory$StorageWriterDataSource.getAppendData(StorageWriterFactory.java:123)
at io.pravega.segmentstore.server.writer.SegmentAggregator.getFlushArgs(SegmentAggregator.java:649)
at io.pravega.segmentstore.server.writer.SegmentAggregator.flushPendingAppends(SegmentAggregator.java:598)
at io.pravega.segmentstore.server.writer.SegmentAggregator.lambda$flushExcess$14(SegmentAggregator.java:579)
at io.pravega.common.concurrent.FutureHelpers$Loop.call(FutureHelpers.java:642)
at io.pravega.common.concurrent.FutureHelpers$Loop.call(FutureHelpers.java:611)
at io.pravega.common.concurrent.FutureHelpers.runOrFail(FutureHelpers.java:524)
at io.pravega.common.concurrent.FutureHelpers$Loop.run(FutureHelpers.java:655)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$201(ScheduledThreadPoolExecutor.java:180)
at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:293)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
```
This error was found against our 0.1 release, so it is possible that it is no longer an issue, but I couldn't find a related one. Unfortunately, we don't have any more information than this stack trace for now.
**Problem location**
Segment store.
**Suggestions for an improvement**
Determine whether it is an issue or not and fix in the case it is.
|
non_process
|
error in storagewriter problem description we have observed the following exception in the segment store while running some tests error i p s server writer storagewriter storagewriter iteration error java lang illegalargumentexception startoffset must refer to an offset beyond the segment s storagelength offset at com google common base preconditions checkargument preconditions java at io pravega segmentstore server reading streamsegmentreadindex readdirect streamsegmentreadindex java at io pravega segmentstore server reading containerreadindex readdirect containerreadindex java at io pravega segmentstore server writer storagewriterfactory storagewriterdatasource getappenddata storagewriterfactory java at io pravega segmentstore server writer segmentaggregator getflushargs segmentaggregator java at io pravega segmentstore server writer segmentaggregator flushpendingappends segmentaggregator java at io pravega segmentstore server writer segmentaggregator lambda flushexcess segmentaggregator java at io pravega common concurrent futurehelpers loop call futurehelpers java at io pravega common concurrent futurehelpers loop call futurehelpers java at io pravega common concurrent futurehelpers runorfail futurehelpers java at io pravega common concurrent futurehelpers loop run futurehelpers java at java util concurrent executors runnableadapter call executors java at java util concurrent futuretask run futuretask java at java util concurrent scheduledthreadpoolexecutor scheduledfuturetask access scheduledthreadpoolexecutor java at java util concurrent scheduledthreadpoolexecutor scheduledfuturetask run scheduledthreadpoolexecutor java at java util concurrent threadpoolexecutor runworker threadpoolexecutor java at java util concurrent threadpoolexecutor worker run threadpoolexecutor java at java lang thread run thread java this error has been found against our release so it is possible that it is not an issue any longer but i couldn t find a related issue unfortunately we don t have for now any more information that this stack trace for now problem location segment store suggestions for an improvement determine whether it is an issue or not and fix in the case it is
| 0
|
289,340
| 24,980,995,959
|
IssuesEvent
|
2022-11-02 11:41:16
|
benoitkugler/maths-online
|
https://api.github.com/repos/benoitkugler/maths-online
|
closed
|
[prof] Editing variants
|
bug A tester
|
Two consecutive edits to the variants of the same question are not taken into account unless you leave the question and come back to it. This will need to be either documented or handled.
|
1.0
|
[prof] Editing variants - Two consecutive edits to the variants of the same question are not taken into account unless you leave the question and come back to it. This will need to be either documented or handled.
|
non_process
|
editing variants two consecutive edits to the variants of the same question are not taken into account unless you leave the question and come back to it this will need to be either documented or handled
| 0
|
2,334
| 5,142,720,164
|
IssuesEvent
|
2017-01-12 14:10:12
|
jimbrown75/Permit-Vision-Enhancements
|
https://api.github.com/repos/jimbrown75/Permit-Vision-Enhancements
|
opened
|
Re-implement limit on number of permits that a holder can hold
|
Further discussion (Shell) Medium Priority Process Related Should Fix
|
| Risk Level According to RAM | Points | Limitation of Permits per Holder (Max 12 points*) |
| --- | --- | --- |
| Low Low | 1 pt | 12 permits |
| Low | 2 pts | 6 permits |
| Medium | 4 pts | 3 permits |
| High | 12 pts | 1 permit |
* Example: 2 medium risk permits (2 x 4 points = 8) + 2 low risk permits (2 x 2 = 4 points) = 12 points (limit reached)
|
1.0
|
Re-implement limit on number of permits that a holder can hold -
| Risk Level According to RAM | Points | Limitation of Permits per Holder (Max 12 points*) |
| --- | --- | --- |
| Low Low | 1 pt | 12 permits |
| Low | 2 pts | 6 permits |
| Medium | 4 pts | 3 permits |
| High | 12 pts | 1 permit |
* Example: 2 medium risk permits (2 x 4 points = 8) + 2 low risk permits (2 x 2 = 4 points) = 12 points (limit reached)
|
process
|
re implement limit on number of permits that a holder can hold risk level according to ram limitation of permits per holder max points low low pt permits low pts permits medium pts permits high pts permit example medium risk permits x points low risk permits x points points limit reached
| 1
|
217,196
| 24,323,511,307
|
IssuesEvent
|
2022-09-30 12:56:59
|
dotnet/runtime
|
https://api.github.com/repos/dotnet/runtime
|
closed
|
[API Proposal]: Authenticated ciphers should distinguish between mismatched tag and failure
|
api-approved area-System.Security in-pr
|
### Background and motivation
When decrypting authenticated data, `AesCcm`, `AesGcm`, and `ChaCha20Poly1305` throw `CryptographicException` in the case when the tag does not match (indicating that either the key is wrong or the data is inauthentic) and also in the case when "the decryption operation otherwise failed". A program may want to take a different action in these two scenarios but currently cannot.
### API Proposal
```C#
namespace System.Security.Cryptography
{
// Note for discussion: Should this be sealed or unsealed?
public sealed class AuthenticationTagMismatchException : CryptographicException
{
public AuthenticationTagMismatchException();
public AuthenticationTagMismatchException(string? message);
public AuthenticationTagMismatchException(string? message, Exception? innerException);
}
}
```
### API Usage
```C#
// User has provided password, and PBKDF was used to derive key.
using (var aesGcm = new AesGcm(key))
{
try
{
aesGcm.Decrypt(iv, ciphertext, tag, plaintext);
}
catch (AuthenticationTagMismatchException)
{
// Notify user that password was incorrect or data was corrupt.
// Prompt user for password again in case user typo'd password.
}
catch (CryptographicException)
{
// Notify the user that decryption failed.
}
}
```
### Alternative Designs
_No response_
### Risks
_No response_
|
True
|
[API Proposal]: Authenticated ciphers should distinguish between mismatched tag and failure - ### Background and motivation
When decrypting authenticated data, `AesCcm`, `AesGcm`, and `ChaCha20Poly1305` throw `CryptographicException` in the case when the tag does not match (indicating that either the key is wrong or the data is inauthentic) and also in the case when "the decryption operation otherwise failed". A program may want to take a different action in these two scenarios but currently cannot.
### API Proposal
```C#
namespace System.Security.Cryptography
{
// Note for discussion: Should this be sealed or unsealed?
public sealed class AuthenticationTagMismatchException : CryptographicException
{
public AuthenticationTagMismatchException();
public AuthenticationTagMismatchException(string? message);
public AuthenticationTagMismatchException(string? message, Exception? innerException);
}
}
```
### API Usage
```C#
// User has provided password, and PBKDF was used to derive key.
using (var aesGcm = new AesGcm(key))
{
try
{
aesGcm.Decrypt(iv, ciphertext, tag, plaintext);
}
catch (AuthenticationTagMismatchException)
{
// Notify user that password was incorrect or data was corrupt.
// Prompt user for password again in case user typo'd password.
}
catch (CryptographicException)
{
// Notify the user that decryption failed.
}
}
```
### Alternative Designs
_No response_
### Risks
_No response_
|
non_process
|
authenticated ciphers should distinguish between mismatched tag and failure background and motivation when decrypting authenticated data aesccm aesgcm and throw cryptographicexception in the case when the tag does not match indicating that either the key is wrong or the data is inauthentic and also in the case when the decryption operation otherwise failed a program may want to take a different action in these two scenarios but currently cannot api proposal c namespace system security cryptography note for discussion should this be sealed or unsealed public sealed class authenticationtagmismatchexception cryptographicexception public authenticationtagmismatchexception public authenticationtagmismatchexception string message public authenticationtagmismatchexception string message exception innerexception api usage c user has provided password and pbkdf was used to derive key using var aesgcm new aesgcm key try aesgcm decrypt iv ciphertext tag plaintext catch authenticationtagmismatchexception notify user that password was incorrect or data was corrupt prompt user for password again in case user typo d password catch cryptographicexception notify the user that decryption failed alternative designs no response risks no response
| 0
|
10,372
| 13,190,562,264
|
IssuesEvent
|
2020-08-13 10:25:13
|
pystatgen/sgkit
|
https://api.github.com/repos/pystatgen/sgkit
|
opened
|
Investigate and document when to add deps to precommit mypy's additional_dependencies
|
process + tools
|
Investigate and document when to add deps to precommit mypy's additional_dependencies.
Related: https://github.com/pystatgen/sgkit/pull/106#issuecomment-673392786
|
1.0
|
Investigate and document when to add deps to precommit mypy's additional_dependencies - Investigate and document when to add deps to precommit mypy's additional_dependencies.
Related: https://github.com/pystatgen/sgkit/pull/106#issuecomment-673392786
|
process
|
investigate and document when to add deps to precommit mypy s additional dependencies investigate and document when to add deps to precommit mypy s additional dependencies related
| 1
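For context on why the question arises: pre-commit installs mypy into its own isolated environment, so any typed runtime dependency mypy must resolve has to be repeated under `additional_dependencies`; otherwise the imports fall back to `Any` (or error under strict settings). A hedged sketch of the shape (the packages listed are illustrative, not sgkit's actual list):

```yaml
# .pre-commit-config.yaml (illustrative)
repos:
  - repo: https://github.com/pre-commit/mirrors-mypy
    rev: v1.10.0
    hooks:
      - id: mypy
        # mypy runs in an isolated venv: repeat typed runtime deps here
        # so their type information is visible to the hook.
        additional_dependencies: [numpy, xarray]
```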
|
820
| 15,287,914,178
|
IssuesEvent
|
2021-02-23 16:16:52
|
openstates/issues
|
https://api.github.com/repos/openstates/issues
|
closed
|
SD Legislator Spot Check Issues
|
component:people-data type:bug
|
State: South Dakota
Short Description: No data/legislator issue, the only issue is with spacing formatting on Open States.
**Missing or Incorrect legislators:**
n/a
**Data Issues:**
n/a
**Additional Data:**
Spacing for addresses/information for several legislators is incorrect/weird (no spacing between words, examples:
-Troy Heinert: https://openstates.org/person/troy-e-heinert-240YT5iCtH4LyfBCkfzV40/
-Hugh M. Bartels: https://openstates.org/person/hugh-m-bartels-2S221nzXLBRaIBy6N4xkgn/
-Randy Gross: https://sdlegislature.gov/Legislators/Profile/1796/Detail
|
1.0
|
SD Legislator Spot Check Issues - State: South Dakota
Short Description: No data/legislator issue, the only issue is with spacing formatting on Open States.
**Missing or Incorrect legislators:**
n/a
**Data Issues:**
n/a
**Additional Data:**
Spacing for addresses/information for several legislators is incorrect/weird (no spacing between words, examples:
-Troy Heinert: https://openstates.org/person/troy-e-heinert-240YT5iCtH4LyfBCkfzV40/
-Hugh M. Bartels: https://openstates.org/person/hugh-m-bartels-2S221nzXLBRaIBy6N4xkgn/
-Randy Gross: https://sdlegislature.gov/Legislators/Profile/1796/Detail
|
non_process
|
sd legislator spot check issues state south dakota short description no data legislator issue the only issue is with spacing formatting on open states missing or incorrect legislators n a data issues n a additional data spacing for addresses information for several legislators is incorrect weird no spacing between words examples troy heinert hugh m bartels randy gross
| 0
|
19,256
| 25,455,066,892
|
IssuesEvent
|
2022-11-24 13:33:26
|
GoldenGnu/jeveassets
|
https://api.github.com/repos/GoldenGnu/jeveassets
|
closed
|
Show reprocessing value from a clipboard paste
|
enhancement done reprocessing
|
Hi, I do a fair few missions and combat sites and tend to reprocess everything to minerals to either sell to corp or sell on market. In the reprocess tab of the application I thought that I could import the items (and quantity of those items) that are on my clipboard and the application would show me, based on my skills and a market (mainly a trade hub) if it would be more profitable to sell (either at sell or buyers price) or to reprocess.
Having a tool like this that shows me, en masse, what I could filter to show items to reprocess or items to sell would be greatly beneficial in optimizing the loot I get from these sites, rather than manually checking each item or just reprocessing everything by default
GoldenGnu advised that this was not possible in the app at the moment but to submit the idea here on github, so here it is
Regards, V
|
1.0
|
Show reprocessing value from a clipboard paste - Hi, I do a fair few missions and combat sites and tend to reprocess everything to minerals to either sell to corp or sell on market. In the reprocess tab of the application I thought that I could import the items (and quantity of those items) that are on my clipboard and the application would show me, based on my skills and a market (mainly a trade hub) if it would be more profitable to sell (either at sell or buyers price) or to reprocess.
Having a tool like this that shows me, en masse, what I could filter to show items to reprocess or items to sell would be greatly beneficial in optimizing the loot I get from these sites, rather than manually checking each item or just reprocessing everything by default
GoldenGnu advised that this was not possible in the app at the moment but to submit the idea here on github, so here it is
Regards, V
|
process
|
show reprocesses value from a clip board paste hi i do a fair few missions and combat sites and tend to reprocess everything to minerals to either sell to corp or sell on market in the reprocess tab of the application i thought that i could import the items and quantity of those items that are on my clipboard and the application would show me based on my skills and a market mainly a trade hub if it would be more profitable to sell either at sell or buyers price or to reprocess having a tool like this that shows me enmass that i could filter to show items to reprocess or items to sell would be greatly beneficial in optimizing the loot i get from these sites rather than manually check each item or just reprocessing everything as a matter of default goldengnu advised that this was not possible in the app at the moment but to submit the idea here on github so here it is regards v
| 1
|
4,135
| 7,090,906,368
|
IssuesEvent
|
2018-01-12 10:47:55
|
Gepardec/Hogarama
|
https://api.github.com/repos/Gepardec/Hogarama
|
closed
|
Develop commit message guideline
|
process improvement
|
cf. https://chris.beams.io/posts/git-commit/
**DoD:**
- [x] We have a commit message guideline in Wiki.
|
1.0
|
Develop commit message guideline - cf. https://chris.beams.io/posts/git-commit/
**DoD:**
- [x] We have a commit message guideline in Wiki.
|
process
|
develop commit message guideline cf dod we have a commit message guideline in wiki
| 1
|
1,132
| 3,615,792,662
|
IssuesEvent
|
2016-02-07 00:37:07
|
t3kt/vjzual2
|
https://api.github.com/repos/t3kt/vjzual2
|
closed
|
advanced color adjustment module
|
enhancement video processing
|
black level, brightness, gamma, contrast, high/low rgb ranges, etc.
like the one in the original vjzual
|
1.0
|
advanced color adjustment module - black level, brightness, gamma, contrast, high/low rgb ranges, etc.
like the one in the original vjzual
|
process
|
advanced color adjustment module black level brightness gamma contrast high low rgb ranges etc like the one in the original vjzual
| 1
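For reference, a minimal numpy sketch of the adjustments the record above lists (black level, brightness, contrast, gamma); the formulas are the common conventions, not necessarily what the original vjzual module implemented:

```python
import numpy as np

def adjust(img, black=0.0, brightness=0.0, contrast=1.0, gamma=1.0):
    """img: float32 array in [0, 1], shape (H, W, 3)."""
    out = np.clip(img - black, 0.0, 1.0) / max(1.0 - black, 1e-6)  # black level
    out = np.clip(out + brightness, 0.0, 1.0)                      # brightness
    out = np.clip((out - 0.5) * contrast + 0.5, 0.0, 1.0)          # contrast
    return np.power(out, 1.0 / gamma)                              # gamma

frame = np.random.rand(4, 4, 3).astype(np.float32)
graded = adjust(frame, black=0.05, brightness=0.02, contrast=1.2, gamma=2.2)
```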
|
822,520
| 30,876,143,656
|
IssuesEvent
|
2023-08-03 14:24:36
|
zowe/zowe-explorer-intellij
|
https://api.github.com/repos/zowe/zowe-explorer-intellij
|
opened
|
Incorrect behaviour of TSO CLI.
|
bug priority-medium severity-medium
|
<!--
Before opening a new issue, please search for the related cases in the existing ones: https://github.com/zowe/zowe-explorer-intellij/issues?q=is%3Aissue
If there is nothing that can help you, feel free to open the bug. Anyway, we will help you to resolve the problem.
-->
I can't execute RACF commands normally in the TSO CLI. When opening the console, the first command seems to be executed: at first I get the message Ready, but then the command seems to be executed a second time (without any action from my side) and, of course, it fails. For example, this is the case shown below for the DELUSER command.

First I got the message Ready, which indicates successful execution of the first command, but then I got INVALID USERID. I think this is because the plugin tries to send the same command a second time, and it fails because the user is no longer there (there is a similar problem for ADDUSER). And when you try to execute further commands (not even RACF commands), nothing works anymore, as if it keeps prompting for a new operand, and it constantly reports INVALID USERID.

<!-- A clear and concise description of the bug. -->
**Steps To Reproduce**
1. Create TSO console;
2. Execute ADDUSER TSTUSER1;

3. Try to send any more command.

<!--
Steps to reproduce the behavior example:
1. Go to '...'
5. Click on '....'
6. Scroll down to '....'
7. See error
-->
**Expected behavior**
<!-- A clear and concise description of what you expected to happen. -->
**IntelliJ IDE Log File**
<!--
To gather the log file, in IntelliJ IDE click *Help* -> *Show Log in Explorer*.
The *idea.log* file is the file we need.
-->
**Screenshots**
<!-- If applicable, add screenshots to help explain your problem. -->
**The Setup**
- OS:
- Zowe Explorer IntelliJ Plug-in Version:
- IntelliJ IDE Version (*Help* -> *About*, screenshot is applicable):
- (Optional) Zowe Kotlin SDK Version:
**Additional context**
<!-- Add any other context about the problem here. -->
|
1.0
|
Incorrect behaviour of TSO CLI. - <!--
Before opening a new issue, please search for the related cases in the existing ones: https://github.com/zowe/zowe-explorer-intellij/issues?q=is%3Aissue
If there is nothing that can help you, feel free to open the bug. Anyway, we will help you to resolve the problem.
-->
I can't execute RACF commands normally in the TSO CLI. When opening the console, the first command seems to be executed: at first I get the message Ready, but then the command seems to be executed a second time (without any action from my side) and, of course, it fails. For example, this is the case shown below for the DELUSER command.

First I got the message Ready, which indicates successful execution of the first command, but then I got INVALID USERID. I think this is because the plugin tries to send the same command a second time, and it fails because the user is no longer there (there is a similar problem for ADDUSER). And when you try to execute further commands (not even RACF commands), nothing works anymore, as if it keeps prompting for a new operand, and it constantly reports INVALID USERID.

<!-- A clear and concise description of the bug. -->
**Steps To Reproduce**
1. Create TSO console;
2. Execute ADDUSER TSTUSER1;

3. Try to send any more command.

<!--
Steps to reproduce the behavior example:
1. Go to '...'
5. Click on '....'
6. Scroll down to '....'
7. See error
-->
**Expected behavior**
<!-- A clear and concise description of what you expected to happen. -->
**IntelliJ IDE Log File**
<!--
To gather the log file, in IntelliJ IDE click *Help* -> *Show Log in Explorer*.
The *idea.log* file is the file we need.
-->
**Screenshots**
<!-- If applicable, add screenshots to help explain your problem. -->
**The Setup**
- OS:
- Zowe Explorer IntelliJ Plug-in Version:
- IntelliJ IDE Version (*Help* -> *About*, screenshot is applicable):
- (Optional) Zowe Kotlin SDK Version:
**Additional context**
<!-- Add any other context about the problem here. -->
|
non_process
|
incorrect behaviour of tso cli before opening a new issue please search for the related cases in the existing ones if there is nothing that can help you feel free to open the bug anyway we will help you to resolve the problem i can t normally execute racf commands in tso cli when opening the console the first command seems to be executed while at first i get the message ready and then the command seems to be executed a second time without any action from my side and of course it fails for example this is the show case for deluser command firstly i got message ready that identifies successful execution of first command but than i got invalid userid and i think it is because of the fact that the plugin tries to send same command second time and it fails because the user is no longer there a similar problem for adduser and when you try to execute commands further not even racf commands then nothing works anymore as if it sends a message to enter a new operand and it constantly starts saying invalid userid steps to reproduce create tso console execute adduser try to send any more command steps to reproduce the behavior example go to click on scroll down to see error expected behavior intellij ide log file to gather the log file in intellij ide click help show log in explorer the idea log file is the file we need screenshots the setup os zowe explorer intellij plug in version intellij ide version help about screenshot is applicable optional zowe kotlin sdk version additional context
| 0
|
62,440
| 3,185,503,064
|
IssuesEvent
|
2015-09-28 05:33:17
|
HellscreamWoW/Tracker
|
https://api.github.com/repos/HellscreamWoW/Tracker
|
closed
|
Mistweaver Monks not regening mana
|
Priority-High Type-Backend
|
I just reached level 10 and my Mistweaver monk got mana added as a thing, but she won't regen, even if I drink water or use other mana regen items
|
1.0
|
Mistweaver Monks not regening mana - I just reached level 10 and my Mistweaver monk got mana added as a thing, but she won't regen, even if I drink water or use other mana regen items
|
non_process
|
mistweaver monks not regening mana i just reached level and my mist weaver monk got mana added as a thing but she won t regen even if i drink water or other mana regen items
| 0
|
2,893
| 5,872,917,417
|
IssuesEvent
|
2017-05-15 12:51:43
|
AllenFang/react-bootstrap-table
|
https://api.github.com/repos/AllenFang/react-bootstrap-table
|
closed
|
After search not always triggered
|
inprocess
|
Hi.
We've been using react-bootstrap-table in three tabs on the same page in our project.
When we wanted to add search that would trigger filtering of all three tables at the same time, we added references to the tables, added a separate text input, and set its 'onChange' to call the tables' 'handleSearch'.
We also set an 'afterSearch' method on all tables that updates the state according to the search result (we use it to display a summary at the top of each tab).
The search works very well in general.
The only problem is that every 30 seconds or so we perform a background update of the data, and sometimes as a result the data for the tables change.
The change is reflected as expected in the tables (an item that was in one table moves to the other as it should), however the 'afterSearch' is not called and as a result the summary we display is now incorrect.
We used version 2.3.4 by I updated to 4.0.0-beta.1 and it didn't help.
Thanks :)
|
1.0
|
After search not always triggered - Hi.
We've been using react-bootstrap-table in three tabs on the same page in our project.
When we wanted to add search that would trigger filtering of all three tables at the same time, we added references to the tables, added a separate text input, and set its 'onChange' to call the tables' 'handleSearch'.
We also set an 'afterSearch' method on all tables that updates the state according to the search result (we use it to display a summary at the top of each tab).
The search works very well in general.
The only problem is that every 30 seconds or so we perform a background update of the data, and sometimes as a result the data for the tables change.
The change is reflected as expected in the tables (an item that was in one table moves to the other as it should), however the 'afterSearch' is not called and as a result the summary we display is now incorrect.
We used version 2.3.4, but I updated to 4.0.0-beta.1 and it didn't help.
Thanks :)
|
process
|
after search not always triggered hi we ve been using react bootstrap table in three tabs on the same page in our project when we wanted to add search that will trigger filtering of all three tables at the same time we added references to the tables added a separate text input and set its onchange to call the tables handlesearch we also set a aftersearch method to all tables that updates the state according to the search result we use it to display a summary at the top of each tab the search works very well in general the only problem is that every seconds or so we perform a background update of the data and sometimes as a result the data for the tables change the change is reflected as expected in the tables an item that was in one table moves to the other as it should however the aftersearch is not called and as a result the summary we display is now incorrect we used version but i updated to beta and it didn t help thanks
| 1
|
6,986
| 10,132,150,591
|
IssuesEvent
|
2019-08-01 21:29:14
|
spring-projects/spring-hateoas
|
https://api.github.com/repos/spring-projects/spring-hateoas
|
closed
|
Use List not Collection when asking for beans
|
in: configuration process: waiting for review
|
`List` will honor Spring’s `@Order` while `Collection` will not.
|
1.0
|
Use List not Collection when asking for beans - `List` will honor Spring’s `@Order` while `Collection` will not.
|
process
|
use list not collection when asking for beans list will honor spring’s order while collection will not
| 1
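A small sketch of the behavior behind this change (bean names are hypothetical): when the injection point is declared as `List`, Spring sorts the collected beans by `@Order`/`Ordered`; a plain `Collection` declaration carries no such ordering guarantee.

```java
import java.util.List;
import org.springframework.core.annotation.Order;
import org.springframework.stereotype.Component;

interface LinkProcessor {
    void process();
}

@Component
@Order(1)
class FirstProcessor implements LinkProcessor {
    public void process() { /* runs first */ }
}

@Component
@Order(2)
class SecondProcessor implements LinkProcessor {
    public void process() { /* runs second */ }
}

@Component
class LinkPipeline {
    // Declared as List: Spring sorts by @Order, so FirstProcessor precedes
    // SecondProcessor. Declared as Collection, the order is unspecified.
    LinkPipeline(List<LinkProcessor> processors) {
        processors.forEach(LinkProcessor::process);
    }
}
```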
|
3,926
| 6,845,902,447
|
IssuesEvent
|
2017-11-13 10:03:06
|
openvstorage/openvstorage-health-check
|
https://api.github.com/repos/openvstorage/openvstorage-health-check
|
closed
|
Healthcheck spews untrue timeouts in proxy-test
|
process_duplicate type_bug
|
### Problem description
List-ns-osds will not have all osd states as 'Active' due to unreachable osds, causing the healthcheck to spew the timeout when it's not needed
### Proposed solution
Check the amount of osds needed to satisfy the preset (which was why we implemented this part) and then continue. If the preset cannot be satisfied -> raise
|
1.0
|
Healthcheck spews untrue timeouts in proxy-test - ### Problem description
List-ns-osds will not have all osd states as 'Active' due to unreachable osds, causing the healthcheck to spew the timeout when it's not needed
### Proposed solution
Check the amount of osds needed to satisfy the preset (which was why we implemented this part) and then continue. If the preset cannot be satisfied -> raise
|
process
|
healthcheck spews untrue timeouts in proxy test problem description list ns osds will not have all osd states as active due to unreachable osds causing the healthcheck to spew the timeout when its not needed proposed solution check the amount of osds needed to satisfy the preset which was why we implemented this part and then continue if the preset cannot be satisfied raise
| 1
|
22,255
| 30,803,325,545
|
IssuesEvent
|
2023-08-01 04:32:29
|
h4sh5/pypi-auto-scanner
|
https://api.github.com/repos/h4sh5/pypi-auto-scanner
|
opened
|
pih 1.48026 has 2 GuardDog issues
|
guarddog typosquatting silent-process-execution
|
https://pypi.org/project/pih
https://inspector.pypi.io/project/pih
```{
"dependency": "pih",
"version": "1.48026",
"result": {
"issues": 2,
"errors": {},
"results": {
"typosquatting": "This package closely ressembles the following package names, and might be a typosquatting attempt: pip, pid",
"silent-process-execution": [
{
"location": "pih-1.48026/pih/tools.py:774",
"code": " result = subprocess.run(command, stdin=subprocess.DEVNULL, stdout=subprocess.DEVNULL, stderr=subprocess.DEVNULL)",
"message": "This package is silently executing an external binary, redirecting stdout, stderr and stdin to /dev/null"
}
]
},
"path": "/tmp/tmpk4ot4_pc/pih"
}
}```
|
1.0
|
pih 1.48026 has 2 GuardDog issues - https://pypi.org/project/pih
https://inspector.pypi.io/project/pih
```{
"dependency": "pih",
"version": "1.48026",
"result": {
"issues": 2,
"errors": {},
"results": {
"typosquatting": "This package closely ressembles the following package names, and might be a typosquatting attempt: pip, pid",
"silent-process-execution": [
{
"location": "pih-1.48026/pih/tools.py:774",
"code": " result = subprocess.run(command, stdin=subprocess.DEVNULL, stdout=subprocess.DEVNULL, stderr=subprocess.DEVNULL)",
"message": "This package is silently executing an external binary, redirecting stdout, stderr and stdin to /dev/null"
}
]
},
"path": "/tmp/tmpk4ot4_pc/pih"
}
}```
|
process
|
pih has guarddog issues dependency pih version result issues errors results typosquatting this package closely ressembles the following package names and might be a typosquatting attempt pip pid silent process execution location pih pih tools py code result subprocess run command stdin subprocess devnull stdout subprocess devnull stderr subprocess devnull message this package is silently executing an external binary redirecting stdout stderr and stdin to dev null path tmp pc pih
| 1
|
16,565
| 21,577,755,310
|
IssuesEvent
|
2022-05-02 15:22:24
|
camunda/zeebe
|
https://api.github.com/repos/camunda/zeebe
|
closed
|
DueDateTimeChecker will block progress if many timers are due
|
kind/bug scope/broker severity/high area/reliability team/process-automation
|
**Describe the bug**
If there are many due timers to be triggered, `DueDateTimeChecker` will iterate over them. During this time, all progress is blocked for this partition.
|
1.0
|
DueDateTimeChecker will block progress if many timers are due - **Describe the bug**
If there are many due timers to be triggered, `DueDateTimeChecker` will iterate over them. During this time, all progress is blocked for this partition.
|
process
|
duedatetimechecker will block progress if many timers are due describe the bug if there are many due timers to be triggered duedatetimechecker will iterate over them during this time all progress is blocked for this partition
| 1
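A generic sketch of the usual mitigation for this kind of stall (class and method names are illustrative, not Zeebe's actual implementation): bound the number of timers triggered per run and reschedule, so one invocation can never hold the thread for the whole backlog.

```java
import java.time.Instant;
import java.util.PriorityQueue;

/** Illustrative only: trigger at most LIMIT_PER_RUN due timers per call
 *  and yield, instead of draining every due timer in one blocking loop. */
final class DueDateChecker {
    private static final int LIMIT_PER_RUN = 1_000;

    private final PriorityQueue<Instant> timers = new PriorityQueue<>();

    /** Returns true if due timers remain and a re-run should be scheduled. */
    boolean triggerDueTimers(Instant now) {
        int triggered = 0;
        while (triggered < LIMIT_PER_RUN
                && !timers.isEmpty()
                && !timers.peek().isAfter(now)) {
            trigger(timers.poll());
            triggered++;
        }
        // Caller reschedules immediately if a backlog is still due.
        return !timers.isEmpty() && !timers.peek().isAfter(now);
    }

    private void trigger(Instant deadline) { /* write TRIGGER record ... */ }
}
```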
|
830
| 3,277,709,654
|
IssuesEvent
|
2015-10-27 02:58:22
|
Funwayguy/InfiniteInvo
|
https://api.github.com/repos/Funwayguy/InfiniteInvo
|
closed
|
Does not work with CrazyCraft Modpack
|
compatibility wontfix
|
---- Minecraft Crash Report ----
// Surprise! Haha. Well, this is awkward.
Time: 10/26/15 9:48 PM
Description: Unexpected error
java.lang.RuntimeException: Player inventory has been replaced with class infiniteinvo.inventory.BigInventoryPlayer
at mods.battlegear2.BattlemodeHookContainerClass.onEntityJoin(BattlemodeHookContainerClass.java:62)
at cpw.mods.fml.common.eventhandler.ASMEventHandler_156_BattlemodeHookContainerClass_onEntityJoin_EntityJoinWorldEvent.invoke(.dynamic)
at cpw.mods.fml.common.eventhandler.ASMEventHandler.invoke(ASMEventHandler.java:54)
at cpw.mods.fml.common.eventhandler.EventBus.post(EventBus.java:140)
at net.minecraft.world.World.func_72838_d(World.java:1334)
at net.minecraft.client.multiplayer.WorldClient.func_72838_d(WorldClient.java:159)
at net.minecraft.client.Minecraft.func_71353_a(Unknown Source)
at net.minecraft.client.Minecraft.func_71403_a(Unknown Source)
at com.mrnobody.morecommands.patch.NetHandlerPlayClient.func_147282_a(NetHandlerPlayClient.java:76)
at com.mumfrey.liteloader.client.PacketEventsClient.handlePacket(PacketEventsClient.java:106)
at com.mumfrey.liteloader.core.PacketEvents.handlePacketEvent(PacketEvents.java:179)
at com.mumfrey.liteloader.core.PacketEvents.handlePacket(PacketEvents.java:134)
at com.mumfrey.liteloader.core.PacketEvents.handlePacket(PacketEvents.java:129)
at com.mumfrey.liteloader.core.event.EventProxy$2.$event00046(Unknown Source)
at net.minecraft.network.play.server.S01PacketJoinGame.func_148833_a(SourceFile)
at net.minecraft.network.NetworkManager.func_74428_b(NetworkManager.java:212)
at net.minecraft.client.Minecraft.func_71407_l(Unknown Source)
at net.minecraft.client.Minecraft.func_71411_J(Unknown Source)
at net.minecraft.client.Minecraft.func_99999_d(Unknown Source)
at net.minecraft.client.main.Main.main(SourceFile:148)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:483)
at net.minecraft.launchwrapper.Launch.launch(Launch.java:135)
at net.minecraft.launchwrapper.Launch.main(Launch.java:28)
A detailed walkthrough of the error, its code path and all known details is as follows:
---------------------------------------------------------------------------------------
-- Head --
Stacktrace:
at mods.battlegear2.BattlemodeHookContainerClass.onEntityJoin(BattlemodeHookContainerClass.java:62)
at cpw.mods.fml.common.eventhandler.ASMEventHandler_156_BattlemodeHookContainerClass_onEntityJoin_EntityJoinWorldEvent.invoke(.dynamic)
at cpw.mods.fml.common.eventhandler.ASMEventHandler.invoke(ASMEventHandler.java:54)
at cpw.mods.fml.common.eventhandler.EventBus.post(EventBus.java:140)
at net.minecraft.world.World.func_72838_d(World.java:1334)
at net.minecraft.client.multiplayer.WorldClient.func_72838_d(WorldClient.java:159)
at net.minecraft.client.Minecraft.func_71353_a(Unknown Source)
at net.minecraft.client.Minecraft.func_71403_a(Unknown Source)
at com.mrnobody.morecommands.patch.NetHandlerPlayClient.func_147282_a(NetHandlerPlayClient.java:76)
at com.mumfrey.liteloader.client.PacketEventsClient.handlePacket(PacketEventsClient.java:106)
at com.mumfrey.liteloader.core.PacketEvents.handlePacketEvent(PacketEvents.java:179)
at com.mumfrey.liteloader.core.PacketEvents.handlePacket(PacketEvents.java:134)
at com.mumfrey.liteloader.core.PacketEvents.handlePacket(PacketEvents.java:129)
at com.mumfrey.liteloader.core.event.EventProxy$2.$event00046(Unknown Source)
at net.minecraft.network.play.server.S01PacketJoinGame.func_148833_a(SourceFile)
at net.minecraft.network.NetworkManager.func_74428_b(NetworkManager.java:212)
-- Affected level --
Details:
Level name: MpServer
All players: 1 total; [EntityClientPlayerMP['MineSlaylehur'/527, l='MpServer', x=8.50, y=66.62, z=8.50]]
Chunk stats: MultiplayerChunkCache: 0, 0
Level seed: 0
Level generator: ID 00 - default, ver 1. Features enabled: false
Level generator options:
Level spawn location: World: (8,64,8), Chunk: (at 8,4,8 in 0,0; contains blocks 0,0,0 to 15,255,15), Region: (0,0; contains chunks 0,0 to 31,31, blocks 0,0,0 to 511,255,511)
Level time: 0 game time, 0 day time
Level dimension: 0
Level storage version: 0x00000 - Unknown?
Level weather: Rain time: 0 (now: false), thunder time: 0 (now: false)
Level game mode: Game mode: creative (ID 1). Hardcore: false. Cheats: false
Forced entities: 0 total; []
Retry entities: 0 total; []
Server brand: ~~NULL~~
Server type: Integrated singleplayer server
Stacktrace:
at net.minecraft.client.multiplayer.WorldClient.func_72914_a(WorldClient.java:373)
at net.minecraft.client.Minecraft.func_71396_d(Unknown Source)
at net.minecraft.client.Minecraft.func_99999_d(Unknown Source)
at net.minecraft.client.main.Main.main(SourceFile:148)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:483)
at net.minecraft.launchwrapper.Launch.launch(Launch.java:135)
at net.minecraft.launchwrapper.Launch.main(Launch.java:28)
-- System Details --
Details:
Minecraft Version: 1.7.10
Operating System: Windows 10 (amd64) version 10.0
Java Version: 1.8.0_25, Oracle Corporation
Java VM Version: Java HotSpot(TM) 64-Bit Server VM (mixed mode), Oracle Corporation
Memory: 3290734152 bytes (3138 MB) / 5248544768 bytes (5005 MB) up to 8160477184 bytes (7782 MB)
Mod Pack: Unknown / None
LiteLoader Mods: 2 loaded mod(s)
- WorldEditCUI version 1.7.10_00
- WorldEditWrapper version 1.2.0
LaunchWrapper: 46 active transformer(s)
- Transformer: cpw.mods.fml.common.asm.transformers.PatchingTransformer
- Transformer: optifine.OptiFineClassTransformer
- Transformer: cpw.mods.fml.common.asm.transformers.MarkerTransformer
- Transformer: cpw.mods.fml.common.asm.transformers.SideTransformer
- Transformer: cpw.mods.fml.common.asm.transformers.EventSubscriptionTransformer
- Transformer: net.minecraftforge.classloading.FluidIdTransformer
- Transformer: codechicken.lib.asm.ClassHeirachyManager
- Transformer: codechicken.core.asm.InterfaceDependancyTransformer
- Transformer: codechicken.core.asm.TweakTransformer
- Transformer: codechicken.core.asm.DelegatedTransformer
- Transformer: codechicken.core.asm.DefaultImplementationTransformer
- Transformer: fastcraft.asm.FastCraftTransformer
- Transformer: mcp.mobius.mobiuscore.asm.CoreTransformer
- Transformer: codechicken.nei.asm.NEITransformer
- Transformer: com.mumfrey.liteloader.transformers.event.EventProxyTransformer
- Transformer: com.mumfrey.liteloader.launch.LiteLoaderTransformer
- Transformer: com.mumfrey.liteloader.client.transformers.CrashReportTransformer
- Transformer: cpw.mods.fml.common.asm.transformers.DeobfuscationTransformer
- Transformer: cpw.mods.fml.common.asm.transformers.AccessTransformer
- Transformer: net.minecraftforge.transformers.ForgeAccessTransformer
- Transformer: pl.asie.lib.core.AsieLibCoremodTransformer
- Transformer: codechicken.core.asm.CodeChickenAccessTransformer
- Transformer: net.malisis.core.asm.MalisisCoreAccessTransformer
- Transformer: mods.battlegear2.coremod.transformers.BattlegearAccessTransformer
- Transformer: mcp.mobius.mobiuscore.asm.CoreAccessTransformer
- Transformer: cpw.mods.fml.common.asm.transformers.ModAccessTransformer
- Transformer: cpw.mods.fml.common.asm.transformers.ItemStackTransformer
- Transformer: net.malisis.core.asm.MalisisCoreTransformer
- Transformer: com.mrnobody.morecommands.asm.transform.TransformBlockRailBase
- Transformer: mods.battlegear2.coremod.transformers.EntityPlayerTransformer
- Transformer: mods.battlegear2.coremod.transformers.ModelBipedTransformer
- Transformer: mods.battlegear2.coremod.transformers.NetClientHandlerTransformer
- Transformer: mods.battlegear2.coremod.transformers.NetServerHandlerTransformer
- Transformer: mods.battlegear2.coremod.transformers.PlayerControllerMPTransformer
- Transformer: mods.battlegear2.coremod.transformers.ItemRendererTransformer
- Transformer: mods.battlegear2.coremod.transformers.MinecraftTransformer
- Transformer: mods.battlegear2.coremod.transformers.ItemStackTransformer
- Transformer: mods.battlegear2.coremod.transformers.ItemInWorldTransformer
- Transformer: mods.battlegear2.coremod.transformers.EntityAIControlledByPlayerTransformer
- Transformer: mods.battlegear2.coremod.transformers.EntityOtherPlayerMPTransformer
- Transformer: cpw.mods.fml.common.asm.transformers.TerminalTransformer
- Transformer: com.mumfrey.liteloader.client.transformers.LiteLoaderEventInjectionTransformer
- Transformer: com.mumfrey.liteloader.client.transformers.MinecraftOverlayTransformer
- Transformer: com.mumfrey.worldeditwrapper.asm.InteractionTransformer
- Transformer: com.mumfrey.liteloader.common.transformers.LiteLoaderPacketTransformer
- Transformer: cpw.mods.fml.common.asm.transformers.ModAPITransformer
JVM Flags: 6 total; -XX:HeapDumpPath=MojangTricksIntelDriversForPerformance_javaw.exe_minecraft.exe.heapdump -Xmx8G -XX:+UseConcMarkSweepGC -XX:+CMSIncrementalMode -XX:-UseAdaptiveSizePolicy -Xmn4096M
AABB Pool Size: 0 (0 bytes; 0 MB) allocated, 0 (0 bytes; 0 MB) used
IntCache: cache: 0, tcache: 0, allocated: 12, tallocated: 94
FML: MCP v9.05 FML v7.10.99.99 Minecraft Forge 10.13.4.1517 Optifine OptiFine_1.7.10_HD_U_C1 73 mods loaded, 72 mods active
States: 'U' = Unloaded 'L' = Loaded 'C' = Constructed 'H' = Pre-initialized 'I' = Initialized 'J' = Post-initialized 'A' = Available 'D' = Disabled 'E' = Errored
UCHIJAAAA mcp{9.05} [Minecraft Coder Pack] (minecraft.jar)
UCHIJAAAA FML{7.10.99.99} [Forge Mod Loader] (forge-1.7.10-10.13.4.1517-1.7.10.jar)
UCHIJAAAA Forge{10.13.4.1517} [Minecraft Forge] (forge-1.7.10-10.13.4.1517-1.7.10.jar)
UCHIJAAAA CodeChickenCore{1.0.7.46} [CodeChicken Core] (minecraft.jar)
UCHIJAAAA MobiusCore{1.2.5} [MobiusCore] (minecraft.jar)
UCHIJAAAA NotEnoughItems{1.0.5.111} [Not Enough Items] (NotEnoughItems-1.7.10-1.0.5.111-universal.jar)
UCHIJAAAA malisiscore{1.7.10-0.10.5+unknown-b0.git-unknown} [Malisis Core] (Malisis' Core.jar)
UCHIJAAAA FLabsBF{4.3} [Better Furnaces] ([1.7.10]Better_Furnaces_V4.3.jar)
UCHIJAAAA Baubles{1.0.1.10} [Baubles] (Baubles.jar)
UCHIJAAAA surpriseeggs{1.0.6} [Surprise Eggs] (Adventure Backpack.jar)
UCHIJAAAA adventurebackpack{1.7.10-0.8b} [Adventure Backpack] (Adventure Backpack.jar)
UCHIJAAAA AnimationAPI{1.2.4} [AnimationAPI] (AnimationAPI.jar)
UCHIJAAAA armourersWorkshop{1.7.10-0.33.0.84} [Armourer's Workshop] (Armourer's Workshop.jar)
UCHIJAAAA asielib{0.2.7} [asielib] (asielib.jar)
UCHIJAAAA Backpack{2.0.1} [Backpack] (backpack-2.0.1-1.7.x.jar)
UCHIJAAAA BiblioCraft{1.10.5} [BiblioCraft] (BiblioCraft.jar)
UCHIJAAAA CarpentersBlocks{3.3.6} [Carpenter's Blocks] (Carpenter's Blocks.jar)
UCHIJAAAA Railcraft{9.6.1.0} [Railcraft] (Railcraft.jar)
UCHIJAAAA TwilightForest{2.3.7} [The Twilight Forest] (twilightforest-1.7.10-2.3.7.jar)
UCHIJAAAA chisel{2.3.10.37} [Chisel 2] (Chisel 2.jar)
UCHIJAAAA RDMenuServer{2.0.0.2} [RDMenuServer] (CustomMenu.jar)
UCHIJAAAA customnpcs{1.7.10d} [CustomNpcs] (CustomNpcs.jar)
UCHIJAAAA DamageIndicatorsMod{3.2.3} [Damage Indicators] (Damage Indicators.jar)
UCHIJAAAA darkcore{0.3} [Dark Core] (Dark Core.jar)
UCHIJAAAA props{2.0.2} [Decocraft] (Decocraft.jar)
UCHIJAAAA EE3{0.3.507} [Equivalent Exchange 3] (Equivalent Exchange 3.jar)
UCHIJAAAA FastCraft{1.21} [FastCraft] (fastcraft-1.21.jar)
UCHIJAAAA FoodPlus{3.2rS} [§bFood Plus] (FoodPlus.jar)
UCHIJAAAA iChunUtil{4.2.2} [iChunUtil] (iChunUtil.jar)
UCHIJAAAA GraviGun{4.0.0-beta} [GraviGun] (GravityGun.jar)
UCHIJAAAA Hats{4.0.1} [Hats] (Hats.jar)
UCHIJAAAA HatStand{4.0.0} [HatStand] (HatStand-4.0.0.jar)
UCHIJAAAA infiniteinvo{1.0.44} [InfiniteInvo] (InfiniteInvo-1.0.44.jar)
UCHIJAAAA InventoryPets{1.1.2a} [Inventory Pets] (Inventory Pets.jar)
UCHIJAAAA IronChest{6.0.60.741} [Iron Chest] (Iron Chest.jar)
UCHIJAAAA pacman{1.0.7} [Killer Pacman] (Killer Pacman.jar)
UCHIJAAAA lucky{5.1.0} [Lucky Block] (LuckyBlock.jar)
UCHIJAAAA malisisdoors{1.7.10-1.4.3} [Malisis' Doors] (Malisis' Doors.jar)
UCHIJAAAA mcheli{0.10.6} [MC Helicopter] (MC Helicopter.zip)
UCHIJAAAA battlegear2{1.7.10} [Mine & Blade Battlegear 2 - Bullseye] (Mine & Blade Battlegear 2 - Bullseye.jar)
UCHIJAAAA MobProperties{0.4.0} [Mob Properties] (Mob Properties.jar)
UCHIJAAAA mrnobody_morecommands{1.6} [More Commands] (MoreCommands-1.7.10-1.6.jar)
UCHIJAAAA Morph{0.9.1} [Morph] (Morph.jar)
UCHIJAAAA cfm{3.4.7} [§9MrCrayfish's Furniture Mod] (MrCrayfish's Furniture Mod.jar)
UCHIJAAAA MultiPageChest{1.3.2} [Multi Page Chest] (Multi-Page-Chest-Mod-1.7.10.jar)
UCHIJAAAA MutantCreatures{1.4.8} [Mutant Creatures] (Mutant Creatures.jar)
UCHIJAAAA NEIAddons{1.12.10.33} [NEI Addons] (NEI Addons- Ex Nihilo.jar)
UCHIJAAAA NEIAddons|AppEng{1.12.10.33} [NEI Addons: Applied Energistics 2] (NEI Addons- Ex Nihilo.jar)
UCHIJAAAA NEIAddons|Botany{1.12.10.33} [NEI Addons: Botany] (NEI Addons- Ex Nihilo.jar)
UCHIJAAAA NEIAddons|Forestry{1.12.10.33} [NEI Addons: Forestry] (NEI Addons- Ex Nihilo.jar)
UCHIJAAAA NEIAddons|CraftingTables{1.12.10.33} [NEI Addons: Crafting Tables] (NEI Addons- Ex Nihilo.jar)
UCHIJAAAA NEIAddons|ExNihilo{1.12.10.33} [NEI Addons: Ex Nihilo] (NEI Addons- Ex Nihilo.jar)
UCHIJAAAA MapWriter{2.1.2} [MapWriter] (Opis.jar)
UCHIJAAAA Opis{1.2.5} [Opis] (Opis.jar)
UCHIJAAAA origin{3.3.0} [Origin] (Origin.jar)
UCHIJAAAA pandorasbox{2.0.1} [Pandora's Box] (Pandora's Box.jar)
UCHIJAAAA PortalGun{4.0.0-beta-4} [PortalGun] (PortalGun-4.0.0-beta-4.jar)
UCHIJAAAA AS_Ruins{15.1} [Ruins Spawning System] (Ruins-1.7.10.jar)
UCHIJAAAA SaintsCore{0.8} [Saintscore] (Saintscore.jar)
UCHIJAAAA securitycraft{v1.7.4.1} [SecurityCraft] (SecurityCraft.jar)
UCHIJAAAA SSTOW{1.7.10-0.1-RC9-7} [Soul Shards: The Old Ways] (Soul Shards- The Old Ways.jar)
UCHIJAAAA statues{2.1.3} [Statues] (Statues.jar)
UCHIJAAAA TardisMod{0.99} [Tardis Mod] (Tardis Mod.jar)
UCHIJAAAA OreSpawn{1.7.10.20.3} [OreSpawn] (The OreSpawn Mod.zip)
UCHIJAAAA secretroomsmod{4.7.1} [The SecretRoomsMod] (The SecretRoomsMod.jar)
UCHIJAAAA TrailMix{4.0.0} [TrailMix] (TrailMix.jar)
UCHIJAAAA transformers{${version}} [Transformers Mod] (Transformers Mod.jar)
UCHIJAAAA jordy141minecraftmagicwrench{0.5 Alpha} [Magic Wrench] (UncraftingTable-1.7.10-alpha1.jar)
UCHIJAAAA uncraftingTable{1.7.10 Alpha 1} [Uncrafting Table] (UncraftingTable-1.7.10-alpha1.jar)
UCHIJAAAA weepingangels{3.3.2} [Weeping Angels] (Weeping Angels.jar)
UCHIJAAAA witchery{0.24.1} [Witchery] (Witchery.jar)
UCHIJAAAA Kradxns Minimap{2.0} [Minimap] (Xray2.02.jar)
UD asielibcore{} [AsieLib CoreMod] (minecraft.jar)
GL info: ' Vendor: 'NVIDIA Corporation' Version: '4.5.0 NVIDIA 355.82' Renderer: 'GeForce GTX 780/PCIe/SSE2'
Launched Version: 1.7.10-Forge10.13.4.1517-1.7.10
LWJGL: 2.9.1
OpenGL: GeForce GTX 780/PCIe/SSE2 GL version 4.5.0 NVIDIA 355.82, NVIDIA Corporation
GL Caps: Using GL 1.3 multitexturing.
Using framebuffer objects because OpenGL 3.0 is supported and separate blending is supported.
Anisotropic filtering is supported and maximum anisotropy is 16.
Shaders are available because OpenGL 2.1 is supported.
Is Modded: Definitely; Client brand changed to 'fml,forge'
Type: Client (map_client.txt)
Resource Packs: [YoutubersSkin]
Current Language: English (US)
Profiler Position: N/A (disabled)
Vec3 Pool Size: 0 (0 bytes; 0 MB) allocated, 0 (0 bytes; 0 MB) used
Anisotropic Filtering: On (16)
|
True
|
Does not work with CrazyCraft Modpack -
---- Minecraft Crash Report ----
// Surprise! Haha. Well, this is awkward.
Time: 10/26/15 9:48 PM
Description: Unexpected error
java.lang.RuntimeException: Player inventory has been replaced with class infiniteinvo.inventory.BigInventoryPlayer
at mods.battlegear2.BattlemodeHookContainerClass.onEntityJoin(BattlemodeHookContainerClass.java:62)
at cpw.mods.fml.common.eventhandler.ASMEventHandler_156_BattlemodeHookContainerClass_onEntityJoin_EntityJoinWorldEvent.invoke(.dynamic)
at cpw.mods.fml.common.eventhandler.ASMEventHandler.invoke(ASMEventHandler.java:54)
at cpw.mods.fml.common.eventhandler.EventBus.post(EventBus.java:140)
at net.minecraft.world.World.func_72838_d(World.java:1334)
at net.minecraft.client.multiplayer.WorldClient.func_72838_d(WorldClient.java:159)
at net.minecraft.client.Minecraft.func_71353_a(Unknown Source)
at net.minecraft.client.Minecraft.func_71403_a(Unknown Source)
at com.mrnobody.morecommands.patch.NetHandlerPlayClient.func_147282_a(NetHandlerPlayClient.java:76)
at com.mumfrey.liteloader.client.PacketEventsClient.handlePacket(PacketEventsClient.java:106)
at com.mumfrey.liteloader.core.PacketEvents.handlePacketEvent(PacketEvents.java:179)
at com.mumfrey.liteloader.core.PacketEvents.handlePacket(PacketEvents.java:134)
at com.mumfrey.liteloader.core.PacketEvents.handlePacket(PacketEvents.java:129)
at com.mumfrey.liteloader.core.event.EventProxy$2.$event00046(Unknown Source)
at net.minecraft.network.play.server.S01PacketJoinGame.func_148833_a(SourceFile)
at net.minecraft.network.NetworkManager.func_74428_b(NetworkManager.java:212)
at net.minecraft.client.Minecraft.func_71407_l(Unknown Source)
at net.minecraft.client.Minecraft.func_71411_J(Unknown Source)
at net.minecraft.client.Minecraft.func_99999_d(Unknown Source)
at net.minecraft.client.main.Main.main(SourceFile:148)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:483)
at net.minecraft.launchwrapper.Launch.launch(Launch.java:135)
at net.minecraft.launchwrapper.Launch.main(Launch.java:28)
A detailed walkthrough of the error, its code path and all known details is as follows:
---------------------------------------------------------------------------------------
-- Head --
Stacktrace:
at mods.battlegear2.BattlemodeHookContainerClass.onEntityJoin(BattlemodeHookContainerClass.java:62)
at cpw.mods.fml.common.eventhandler.ASMEventHandler_156_BattlemodeHookContainerClass_onEntityJoin_EntityJoinWorldEvent.invoke(.dynamic)
at cpw.mods.fml.common.eventhandler.ASMEventHandler.invoke(ASMEventHandler.java:54)
at cpw.mods.fml.common.eventhandler.EventBus.post(EventBus.java:140)
at net.minecraft.world.World.func_72838_d(World.java:1334)
at net.minecraft.client.multiplayer.WorldClient.func_72838_d(WorldClient.java:159)
at net.minecraft.client.Minecraft.func_71353_a(Unknown Source)
at net.minecraft.client.Minecraft.func_71403_a(Unknown Source)
at com.mrnobody.morecommands.patch.NetHandlerPlayClient.func_147282_a(NetHandlerPlayClient.java:76)
at com.mumfrey.liteloader.client.PacketEventsClient.handlePacket(PacketEventsClient.java:106)
at com.mumfrey.liteloader.core.PacketEvents.handlePacketEvent(PacketEvents.java:179)
at com.mumfrey.liteloader.core.PacketEvents.handlePacket(PacketEvents.java:134)
at com.mumfrey.liteloader.core.PacketEvents.handlePacket(PacketEvents.java:129)
at com.mumfrey.liteloader.core.event.EventProxy$2.$event00046(Unknown Source)
at net.minecraft.network.play.server.S01PacketJoinGame.func_148833_a(SourceFile)
at net.minecraft.network.NetworkManager.func_74428_b(NetworkManager.java:212)
-- Affected level --
Details:
Level name: MpServer
All players: 1 total; [EntityClientPlayerMP['MineSlaylehur'/527, l='MpServer', x=8.50, y=66.62, z=8.50]]
Chunk stats: MultiplayerChunkCache: 0, 0
Level seed: 0
Level generator: ID 00 - default, ver 1. Features enabled: false
Level generator options:
Level spawn location: World: (8,64,8), Chunk: (at 8,4,8 in 0,0; contains blocks 0,0,0 to 15,255,15), Region: (0,0; contains chunks 0,0 to 31,31, blocks 0,0,0 to 511,255,511)
Level time: 0 game time, 0 day time
Level dimension: 0
Level storage version: 0x00000 - Unknown?
Level weather: Rain time: 0 (now: false), thunder time: 0 (now: false)
Level game mode: Game mode: creative (ID 1). Hardcore: false. Cheats: false
Forced entities: 0 total; []
Retry entities: 0 total; []
Server brand: ~~NULL~~
Server type: Integrated singleplayer server
Stacktrace:
at net.minecraft.client.multiplayer.WorldClient.func_72914_a(WorldClient.java:373)
at net.minecraft.client.Minecraft.func_71396_d(Unknown Source)
at net.minecraft.client.Minecraft.func_99999_d(Unknown Source)
at net.minecraft.client.main.Main.main(SourceFile:148)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:483)
at net.minecraft.launchwrapper.Launch.launch(Launch.java:135)
at net.minecraft.launchwrapper.Launch.main(Launch.java:28)
-- System Details --
Details:
Minecraft Version: 1.7.10
Operating System: Windows 10 (amd64) version 10.0
Java Version: 1.8.0_25, Oracle Corporation
Java VM Version: Java HotSpot(TM) 64-Bit Server VM (mixed mode), Oracle Corporation
Memory: 3290734152 bytes (3138 MB) / 5248544768 bytes (5005 MB) up to 8160477184 bytes (7782 MB)
Mod Pack: Unknown / None
LiteLoader Mods: 2 loaded mod(s)
- WorldEditCUI version 1.7.10_00
- WorldEditWrapper version 1.2.0
LaunchWrapper: 46 active transformer(s)
- Transformer: cpw.mods.fml.common.asm.transformers.PatchingTransformer
- Transformer: optifine.OptiFineClassTransformer
- Transformer: cpw.mods.fml.common.asm.transformers.MarkerTransformer
- Transformer: cpw.mods.fml.common.asm.transformers.SideTransformer
- Transformer: cpw.mods.fml.common.asm.transformers.EventSubscriptionTransformer
- Transformer: net.minecraftforge.classloading.FluidIdTransformer
- Transformer: codechicken.lib.asm.ClassHeirachyManager
- Transformer: codechicken.core.asm.InterfaceDependancyTransformer
- Transformer: codechicken.core.asm.TweakTransformer
- Transformer: codechicken.core.asm.DelegatedTransformer
- Transformer: codechicken.core.asm.DefaultImplementationTransformer
- Transformer: fastcraft.asm.FastCraftTransformer
- Transformer: mcp.mobius.mobiuscore.asm.CoreTransformer
- Transformer: codechicken.nei.asm.NEITransformer
- Transformer: com.mumfrey.liteloader.transformers.event.EventProxyTransformer
- Transformer: com.mumfrey.liteloader.launch.LiteLoaderTransformer
- Transformer: com.mumfrey.liteloader.client.transformers.CrashReportTransformer
- Transformer: cpw.mods.fml.common.asm.transformers.DeobfuscationTransformer
- Transformer: cpw.mods.fml.common.asm.transformers.AccessTransformer
- Transformer: net.minecraftforge.transformers.ForgeAccessTransformer
- Transformer: pl.asie.lib.core.AsieLibCoremodTransformer
- Transformer: codechicken.core.asm.CodeChickenAccessTransformer
- Transformer: net.malisis.core.asm.MalisisCoreAccessTransformer
- Transformer: mods.battlegear2.coremod.transformers.BattlegearAccessTransformer
- Transformer: mcp.mobius.mobiuscore.asm.CoreAccessTransformer
- Transformer: cpw.mods.fml.common.asm.transformers.ModAccessTransformer
- Transformer: cpw.mods.fml.common.asm.transformers.ItemStackTransformer
- Transformer: net.malisis.core.asm.MalisisCoreTransformer
- Transformer: com.mrnobody.morecommands.asm.transform.TransformBlockRailBase
- Transformer: mods.battlegear2.coremod.transformers.EntityPlayerTransformer
- Transformer: mods.battlegear2.coremod.transformers.ModelBipedTransformer
- Transformer: mods.battlegear2.coremod.transformers.NetClientHandlerTransformer
- Transformer: mods.battlegear2.coremod.transformers.NetServerHandlerTransformer
- Transformer: mods.battlegear2.coremod.transformers.PlayerControllerMPTransformer
- Transformer: mods.battlegear2.coremod.transformers.ItemRendererTransformer
- Transformer: mods.battlegear2.coremod.transformers.MinecraftTransformer
- Transformer: mods.battlegear2.coremod.transformers.ItemStackTransformer
- Transformer: mods.battlegear2.coremod.transformers.ItemInWorldTransformer
- Transformer: mods.battlegear2.coremod.transformers.EntityAIControlledByPlayerTransformer
- Transformer: mods.battlegear2.coremod.transformers.EntityOtherPlayerMPTransformer
- Transformer: cpw.mods.fml.common.asm.transformers.TerminalTransformer
- Transformer: com.mumfrey.liteloader.client.transformers.LiteLoaderEventInjectionTransformer
- Transformer: com.mumfrey.liteloader.client.transformers.MinecraftOverlayTransformer
- Transformer: com.mumfrey.worldeditwrapper.asm.InteractionTransformer
- Transformer: com.mumfrey.liteloader.common.transformers.LiteLoaderPacketTransformer
- Transformer: cpw.mods.fml.common.asm.transformers.ModAPITransformer
JVM Flags: 6 total; -XX:HeapDumpPath=MojangTricksIntelDriversForPerformance_javaw.exe_minecraft.exe.heapdump -Xmx8G -XX:+UseConcMarkSweepGC -XX:+CMSIncrementalMode -XX:-UseAdaptiveSizePolicy -Xmn4096M
AABB Pool Size: 0 (0 bytes; 0 MB) allocated, 0 (0 bytes; 0 MB) used
IntCache: cache: 0, tcache: 0, allocated: 12, tallocated: 94
FML: MCP v9.05 FML v7.10.99.99 Minecraft Forge 10.13.4.1517 Optifine OptiFine_1.7.10_HD_U_C1 73 mods loaded, 72 mods active
States: 'U' = Unloaded 'L' = Loaded 'C' = Constructed 'H' = Pre-initialized 'I' = Initialized 'J' = Post-initialized 'A' = Available 'D' = Disabled 'E' = Errored
UCHIJAAAA mcp{9.05} [Minecraft Coder Pack] (minecraft.jar)
UCHIJAAAA FML{7.10.99.99} [Forge Mod Loader] (forge-1.7.10-10.13.4.1517-1.7.10.jar)
UCHIJAAAA Forge{10.13.4.1517} [Minecraft Forge] (forge-1.7.10-10.13.4.1517-1.7.10.jar)
UCHIJAAAA CodeChickenCore{1.0.7.46} [CodeChicken Core] (minecraft.jar)
UCHIJAAAA MobiusCore{1.2.5} [MobiusCore] (minecraft.jar)
UCHIJAAAA NotEnoughItems{1.0.5.111} [Not Enough Items] (NotEnoughItems-1.7.10-1.0.5.111-universal.jar)
UCHIJAAAA malisiscore{1.7.10-0.10.5+unknown-b0.git-unknown} [Malisis Core] (Malisis' Core.jar)
UCHIJAAAA FLabsBF{4.3} [Better Furnaces] ([1.7.10]Better_Furnaces_V4.3.jar)
UCHIJAAAA Baubles{1.0.1.10} [Baubles] (Baubles.jar)
UCHIJAAAA surpriseeggs{1.0.6} [Surprise Eggs] (Adventure Backpack.jar)
UCHIJAAAA adventurebackpack{1.7.10-0.8b} [Adventure Backpack] (Adventure Backpack.jar)
UCHIJAAAA AnimationAPI{1.2.4} [AnimationAPI] (AnimationAPI.jar)
UCHIJAAAA armourersWorkshop{1.7.10-0.33.0.84} [Armourer's Workshop] (Armourer's Workshop.jar)
UCHIJAAAA asielib{0.2.7} [asielib] (asielib.jar)
UCHIJAAAA Backpack{2.0.1} [Backpack] (backpack-2.0.1-1.7.x.jar)
UCHIJAAAA BiblioCraft{1.10.5} [BiblioCraft] (BiblioCraft.jar)
UCHIJAAAA CarpentersBlocks{3.3.6} [Carpenter's Blocks] (Carpenter's Blocks.jar)
UCHIJAAAA Railcraft{9.6.1.0} [Railcraft] (Railcraft.jar)
UCHIJAAAA TwilightForest{2.3.7} [The Twilight Forest] (twilightforest-1.7.10-2.3.7.jar)
UCHIJAAAA chisel{2.3.10.37} [Chisel 2] (Chisel 2.jar)
UCHIJAAAA RDMenuServer{2.0.0.2} [RDMenuServer] (CustomMenu.jar)
UCHIJAAAA customnpcs{1.7.10d} [CustomNpcs] (CustomNpcs.jar)
UCHIJAAAA DamageIndicatorsMod{3.2.3} [Damage Indicators] (Damage Indicators.jar)
UCHIJAAAA darkcore{0.3} [Dark Core] (Dark Core.jar)
UCHIJAAAA props{2.0.2} [Decocraft] (Decocraft.jar)
UCHIJAAAA EE3{0.3.507} [Equivalent Exchange 3] (Equivalent Exchange 3.jar)
UCHIJAAAA FastCraft{1.21} [FastCraft] (fastcraft-1.21.jar)
UCHIJAAAA FoodPlus{3.2rS} [§bFood Plus] (FoodPlus.jar)
UCHIJAAAA iChunUtil{4.2.2} [iChunUtil] (iChunUtil.jar)
UCHIJAAAA GraviGun{4.0.0-beta} [GraviGun] (GravityGun.jar)
UCHIJAAAA Hats{4.0.1} [Hats] (Hats.jar)
UCHIJAAAA HatStand{4.0.0} [HatStand] (HatStand-4.0.0.jar)
UCHIJAAAA infiniteinvo{1.0.44} [InfiniteInvo] (InfiniteInvo-1.0.44.jar)
UCHIJAAAA InventoryPets{1.1.2a} [Inventory Pets] (Inventory Pets.jar)
UCHIJAAAA IronChest{6.0.60.741} [Iron Chest] (Iron Chest.jar)
UCHIJAAAA pacman{1.0.7} [Killer Pacman] (Killer Pacman.jar)
UCHIJAAAA lucky{5.1.0} [Lucky Block] (LuckyBlock.jar)
UCHIJAAAA malisisdoors{1.7.10-1.4.3} [Malisis' Doors] (Malisis' Doors.jar)
UCHIJAAAA mcheli{0.10.6} [MC Helicopter] (MC Helicopter.zip)
UCHIJAAAA battlegear2{1.7.10} [Mine & Blade Battlegear 2 - Bullseye] (Mine & Blade Battlegear 2 - Bullseye.jar)
UCHIJAAAA MobProperties{0.4.0} [Mob Properties] (Mob Properties.jar)
UCHIJAAAA mrnobody_morecommands{1.6} [More Commands] (MoreCommands-1.7.10-1.6.jar)
UCHIJAAAA Morph{0.9.1} [Morph] (Morph.jar)
UCHIJAAAA cfm{3.4.7} [§9MrCrayfish's Furniture Mod] (MrCrayfish's Furniture Mod.jar)
UCHIJAAAA MultiPageChest{1.3.2} [Multi Page Chest] (Multi-Page-Chest-Mod-1.7.10.jar)
UCHIJAAAA MutantCreatures{1.4.8} [Mutant Creatures] (Mutant Creatures.jar)
UCHIJAAAA NEIAddons{1.12.10.33} [NEI Addons] (NEI Addons- Ex Nihilo.jar)
UCHIJAAAA NEIAddons|AppEng{1.12.10.33} [NEI Addons: Applied Energistics 2] (NEI Addons- Ex Nihilo.jar)
UCHIJAAAA NEIAddons|Botany{1.12.10.33} [NEI Addons: Botany] (NEI Addons- Ex Nihilo.jar)
UCHIJAAAA NEIAddons|Forestry{1.12.10.33} [NEI Addons: Forestry] (NEI Addons- Ex Nihilo.jar)
UCHIJAAAA NEIAddons|CraftingTables{1.12.10.33} [NEI Addons: Crafting Tables] (NEI Addons- Ex Nihilo.jar)
UCHIJAAAA NEIAddons|ExNihilo{1.12.10.33} [NEI Addons: Ex Nihilo] (NEI Addons- Ex Nihilo.jar)
UCHIJAAAA MapWriter{2.1.2} [MapWriter] (Opis.jar)
UCHIJAAAA Opis{1.2.5} [Opis] (Opis.jar)
UCHIJAAAA origin{3.3.0} [Origin] (Origin.jar)
UCHIJAAAA pandorasbox{2.0.1} [Pandora's Box] (Pandora's Box.jar)
UCHIJAAAA PortalGun{4.0.0-beta-4} [PortalGun] (PortalGun-4.0.0-beta-4.jar)
UCHIJAAAA AS_Ruins{15.1} [Ruins Spawning System] (Ruins-1.7.10.jar)
UCHIJAAAA SaintsCore{0.8} [Saintscore] (Saintscore.jar)
UCHIJAAAA securitycraft{v1.7.4.1} [SecurityCraft] (SecurityCraft.jar)
UCHIJAAAA SSTOW{1.7.10-0.1-RC9-7} [Soul Shards: The Old Ways] (Soul Shards- The Old Ways.jar)
UCHIJAAAA statues{2.1.3} [Statues] (Statues.jar)
UCHIJAAAA TardisMod{0.99} [Tardis Mod] (Tardis Mod.jar)
UCHIJAAAA OreSpawn{1.7.10.20.3} [OreSpawn] (The OreSpawn Mod.zip)
UCHIJAAAA secretroomsmod{4.7.1} [The SecretRoomsMod] (The SecretRoomsMod.jar)
UCHIJAAAA TrailMix{4.0.0} [TrailMix] (TrailMix.jar)
UCHIJAAAA transformers{${version}} [Transformers Mod] (Transformers Mod.jar)
UCHIJAAAA jordy141minecraftmagicwrench{0.5 Alpha} [Magic Wrench] (UncraftingTable-1.7.10-alpha1.jar)
UCHIJAAAA uncraftingTable{1.7.10 Alpha 1} [Uncrafting Table] (UncraftingTable-1.7.10-alpha1.jar)
UCHIJAAAA weepingangels{3.3.2} [Weeping Angels] (Weeping Angels.jar)
UCHIJAAAA witchery{0.24.1} [Witchery] (Witchery.jar)
UCHIJAAAA Kradxns Minimap{2.0} [Minimap] (Xray2.02.jar)
UD asielibcore{} [AsieLib CoreMod] (minecraft.jar)
GL info: ' Vendor: 'NVIDIA Corporation' Version: '4.5.0 NVIDIA 355.82' Renderer: 'GeForce GTX 780/PCIe/SSE2'
Launched Version: 1.7.10-Forge10.13.4.1517-1.7.10
LWJGL: 2.9.1
OpenGL: GeForce GTX 780/PCIe/SSE2 GL version 4.5.0 NVIDIA 355.82, NVIDIA Corporation
GL Caps: Using GL 1.3 multitexturing.
Using framebuffer objects because OpenGL 3.0 is supported and separate blending is supported.
Anisotropic filtering is supported and maximum anisotropy is 16.
Shaders are available because OpenGL 2.1 is supported.
Is Modded: Definitely; Client brand changed to 'fml,forge'
Type: Client (map_client.txt)
Resource Packs: [YoutubersSkin]
Current Language: English (US)
Profiler Position: N/A (disabled)
Vec3 Pool Size: 0 (0 bytes; 0 MB) allocated, 0 (0 bytes; 0 MB) used
Anisotropic Filtering: On (16)
|
non_process
|
does not work with crazycraft modpack minecraft crash report surprise haha well this is awkward time pm description unexpected error java lang runtimeexception player inventory has been replaced with class infiniteinvo inventory biginventoryplayer at mods battlemodehookcontainerclass onentityjoin battlemodehookcontainerclass java at cpw mods fml common eventhandler asmeventhandler battlemodehookcontainerclass onentityjoin entityjoinworldevent invoke dynamic at cpw mods fml common eventhandler asmeventhandler invoke asmeventhandler java at cpw mods fml common eventhandler eventbus post eventbus java at net minecraft world world func d world java at net minecraft client multiplayer worldclient func d worldclient java at net minecraft client minecraft func a unknown source at net minecraft client minecraft func a unknown source at com mrnobody morecommands patch nethandlerplayclient func a nethandlerplayclient java at com mumfrey liteloader client packeteventsclient handlepacket packeteventsclient java at com mumfrey liteloader core packetevents handlepacketevent packetevents java at com mumfrey liteloader core packetevents handlepacket packetevents java at com mumfrey liteloader core packetevents handlepacket packetevents java at com mumfrey liteloader core event eventproxy unknown source at net minecraft network play server func a sourcefile at net minecraft network networkmanager func b networkmanager java at net minecraft client minecraft func l unknown source at net minecraft client minecraft func j unknown source at net minecraft client minecraft func d unknown source at net minecraft client main main main sourcefile at sun reflect nativemethodaccessorimpl native method at sun reflect nativemethodaccessorimpl invoke nativemethodaccessorimpl java at sun reflect delegatingmethodaccessorimpl invoke delegatingmethodaccessorimpl java at java lang reflect method invoke method java at net minecraft launchwrapper launch launch launch java at net minecraft launchwrapper launch main launch java a detailed walkthrough of the error its code path and all known details is as follows head stacktrace at mods battlemodehookcontainerclass onentityjoin battlemodehookcontainerclass java at cpw mods fml common eventhandler asmeventhandler battlemodehookcontainerclass onentityjoin entityjoinworldevent invoke dynamic at cpw mods fml common eventhandler asmeventhandler invoke asmeventhandler java at cpw mods fml common eventhandler eventbus post eventbus java at net minecraft world world func d world java at net minecraft client multiplayer worldclient func d worldclient java at net minecraft client minecraft func a unknown source at net minecraft client minecraft func a unknown source at com mrnobody morecommands patch nethandlerplayclient func a nethandlerplayclient java at com mumfrey liteloader client packeteventsclient handlepacket packeteventsclient java at com mumfrey liteloader core packetevents handlepacketevent packetevents java at com mumfrey liteloader core packetevents handlepacket packetevents java at com mumfrey liteloader core packetevents handlepacket packetevents java at com mumfrey liteloader core event eventproxy unknown source at net minecraft network play server func a sourcefile at net minecraft network networkmanager func b networkmanager java affected level details level name mpserver all players total chunk stats multiplayerchunkcache level seed level generator id default ver features enabled false level generator options level spawn location world chunk at in contains blocks to region contains chunks to blocks to level time game time day time level dimension level storage version unknown level weather rain time now false thunder time now false level game mode game mode creative id hardcore false cheats false forced entities total retry entities total server brand null server type integrated singleplayer server stacktrace at net minecraft client multiplayer worldclient func a worldclient java at net minecraft client minecraft func d unknown source at net minecraft client minecraft func d unknown source at net minecraft client main main main sourcefile at sun reflect nativemethodaccessorimpl native method at sun reflect nativemethodaccessorimpl invoke nativemethodaccessorimpl java at sun reflect delegatingmethodaccessorimpl invoke delegatingmethodaccessorimpl java at java lang reflect method invoke method java at net minecraft launchwrapper launch launch launch java at net minecraft launchwrapper launch main launch java system details details minecraft version operating system windows version java version oracle corporation java vm version java hotspot tm bit server vm mixed mode oracle corporation memory bytes mb bytes mb up to bytes mb mod pack unknown none liteloader mods loaded mod s worldeditcui version worldeditwrapper version launchwrapper active transformer s transformer cpw mods fml common asm transformers patchingtransformer transformer optifine optifineclasstransformer transformer cpw mods fml common asm transformers markertransformer transformer cpw mods fml common asm transformers sidetransformer transformer cpw mods fml common asm transformers eventsubscriptiontransformer transformer net minecraftforge classloading fluididtransformer transformer codechicken lib asm classheirachymanager transformer codechicken core asm interfacedependancytransformer transformer codechicken core asm tweaktransformer transformer codechicken core asm delegatedtransformer transformer codechicken core asm defaultimplementationtransformer transformer fastcraft asm fastcrafttransformer transformer mcp mobius mobiuscore asm coretransformer transformer codechicken nei asm neitransformer transformer com mumfrey liteloader transformers event eventproxytransformer transformer com mumfrey liteloader launch liteloadertransformer transformer com mumfrey liteloader client transformers crashreporttransformer transformer cpw mods fml common asm transformers deobfuscationtransformer transformer cpw mods fml common asm transformers accesstransformer transformer net minecraftforge transformers forgeaccesstransformer transformer pl asie lib core asielibcoremodtransformer transformer codechicken core asm codechickenaccesstransformer transformer net malisis core asm malisiscoreaccesstransformer transformer mods coremod transformers battlegearaccesstransformer transformer mcp mobius mobiuscore asm coreaccesstransformer transformer cpw mods fml common asm transformers modaccesstransformer transformer cpw mods fml common asm transformers itemstacktransformer transformer net malisis core asm malisiscoretransformer transformer com mrnobody morecommands asm transform transformblockrailbase transformer mods coremod transformers entityplayertransformer transformer mods coremod transformers modelbipedtransformer transformer mods coremod transformers netclienthandlertransformer transformer mods coremod transformers netserverhandlertransformer transformer mods coremod transformers playercontrollermptransformer transformer mods coremod transformers itemrenderertransformer transformer mods coremod transformers minecrafttransformer transformer mods coremod transformers itemstacktransformer transformer mods coremod transformers iteminworldtransformer transformer mods coremod transformers entityaicontrolledbyplayertransformer transformer mods coremod transformers entityotherplayermptransformer transformer cpw mods fml common asm transformers terminaltransformer transformer com mumfrey liteloader client transformers liteloadereventinjectiontransformer transformer com mumfrey liteloader client transformers minecraftoverlaytransformer transformer com mumfrey worldeditwrapper asm interactiontransformer transformer com mumfrey liteloader common transformers liteloaderpackettransformer transformer cpw mods fml common asm transformers modapitransformer jvm flags total xx heapdumppath mojangtricksinteldriversforperformance javaw exe minecraft exe heapdump xx useconcmarksweepgc xx cmsincrementalmode xx useadaptivesizepolicy aabb pool size bytes mb allocated bytes mb used intcache cache tcache allocated tallocated fml mcp fml minecraft forge optifine optifine hd u mods loaded mods active states u unloaded l loaded c constructed h pre initialized i initialized j post initialized a available d disabled e errored uchijaaaa mcp minecraft jar uchijaaaa fml forge jar uchijaaaa forge forge jar uchijaaaa codechickencore minecraft jar uchijaaaa mobiuscore minecraft jar uchijaaaa notenoughitems notenoughitems universal jar uchijaaaa malisiscore unknown git unknown malisis core jar uchijaaaa flabsbf better furnaces jar uchijaaaa baubles baubles jar uchijaaaa surpriseeggs adventure backpack jar uchijaaaa adventurebackpack adventure backpack jar uchijaaaa animationapi animationapi jar uchijaaaa armourersworkshop armourer s workshop jar uchijaaaa asielib asielib jar uchijaaaa backpack backpack x jar uchijaaaa bibliocraft bibliocraft jar uchijaaaa carpentersblocks carpenter s blocks jar uchijaaaa railcraft railcraft jar uchijaaaa twilightforest twilightforest jar uchijaaaa chisel chisel jar uchijaaaa rdmenuserver custommenu jar uchijaaaa customnpcs customnpcs jar uchijaaaa damageindicatorsmod damage indicators jar uchijaaaa darkcore dark core jar uchijaaaa props decocraft jar uchijaaaa equivalent exchange jar uchijaaaa fastcraft fastcraft jar uchijaaaa foodplus foodplus jar uchijaaaa ichunutil ichunutil jar uchijaaaa gravigun beta gravitygun jar uchijaaaa hats hats jar uchijaaaa hatstand hatstand jar uchijaaaa infiniteinvo infiniteinvo jar uchijaaaa inventorypets inventory pets jar uchijaaaa ironchest iron chest jar uchijaaaa pacman killer pacman jar uchijaaaa lucky luckyblock jar uchijaaaa malisisdoors malisis doors jar uchijaaaa mcheli mc helicopter zip uchijaaaa mine blade battlegear bullseye jar uchijaaaa mobproperties mob properties jar uchijaaaa mrnobody morecommands morecommands jar uchijaaaa morph morph jar uchijaaaa cfm mrcrayfish s furniture mod jar uchijaaaa multipagechest multi page chest mod jar uchijaaaa mutantcreatures mutant creatures jar uchijaaaa neiaddons nei addons ex nihilo jar uchijaaaa neiaddons appeng nei addons ex nihilo jar uchijaaaa neiaddons botany nei addons ex nihilo jar uchijaaaa neiaddons forestry nei addons ex nihilo jar uchijaaaa neiaddons craftingtables nei addons ex nihilo jar uchijaaaa neiaddons exnihilo nei addons ex nihilo jar uchijaaaa mapwriter opis jar uchijaaaa opis opis jar uchijaaaa origin origin jar uchijaaaa pandorasbox pandora s box jar uchijaaaa portalgun beta portalgun beta jar uchijaaaa as ruins ruins jar uchijaaaa saintscore saintscore jar uchijaaaa securitycraft securitycraft jar uchijaaaa sstow soul shards the old ways jar uchijaaaa statues statues jar uchijaaaa tardismod tardis mod jar uchijaaaa orespawn the orespawn mod zip uchijaaaa secretroomsmod the secretroomsmod jar uchijaaaa trailmix trailmix jar uchijaaaa transformers version transformers mod jar uchijaaaa alpha uncraftingtable jar uchijaaaa uncraftingtable alpha uncraftingtable jar uchijaaaa weepingangels weeping angels jar uchijaaaa witchery witchery jar uchijaaaa kradxns minimap jar ud asielibcore minecraft jar gl info vendor nvidia corporation version nvidia renderer geforce gtx pcie launched version lwjgl opengl geforce gtx pcie gl version nvidia nvidia corporation gl caps using gl multitexturing using framebuffer objects because opengl is supported and separate blending is supported anisotropic filtering is supported and maximum anisotropy is shaders are available because opengl is supported is modded definitely client brand changed to fml forge type client map client txt resource packs current language english us profiler position n a disabled pool size bytes mb allocated bytes mb used anisotropic filtering on
| 0
|
111,707
| 11,740,202,799
|
IssuesEvent
|
2020-03-11 19:08:17
|
MovingBlocks/TerasologyLauncher
|
https://api.github.com/repos/MovingBlocks/TerasologyLauncher
|
opened
|
Update download link on terasology.org
|
Status: Blocked Topic: Documentation Type: Maintenance
|
1. Use latest launcher release
2. Use OS-specific artifact:
- Option 1: https://github.com/MovingBlocks/movingblocks.github.com/issues/55
- Option 2: Detect the OS via the browser and select the zip accordingly (see the sketch below)
Currently, (2) is blocked because the latest release (v3.3.0) does not provide OS-specific artifacts.
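A minimal client-side sketch of option (2) in TypeScript; the per-OS asset names and the `#download` element are assumptions, since the current release does not yet ship OS-specific zips:
```
// Sketch only: the asset names below are hypothetical until the launcher
// actually publishes OS-specific zips on its releases page.
const RELEASE_BASE =
  "https://github.com/MovingBlocks/TerasologyLauncher/releases/latest/download";

function detectDownloadUrl(userAgent: string): string {
  if (/windows/i.test(userAgent)) return `${RELEASE_BASE}/TerasologyLauncher-windows.zip`;
  if (/mac os|macintosh/i.test(userAgent)) return `${RELEASE_BASE}/TerasologyLauncher-mac.zip`;
  if (/linux/i.test(userAgent)) return `${RELEASE_BASE}/TerasologyLauncher-linux.zip`;
  // Unknown OS: fall back to the generic cross-platform zip.
  return `${RELEASE_BASE}/TerasologyLauncher.zip`;
}

// Usage on the page (assumes a #download anchor exists):
// document.querySelector<HTMLAnchorElement>("#download")!.href =
//   detectDownloadUrl(navigator.userAgent);
```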
|
1.0
|
Update download link on terasology.org - 1. Use latest launcher release
2. Use OS-specific artifact:
- Option 1: https://github.com/MovingBlocks/movingblocks.github.com/issues/55
- Option 2: Detect OS via browser and select zip accordingly
Currently, (2) is blocked by the latest release (currently v3.3.0) not providing OS-specific artifacts.
|
non_process
|
update download link on terasology org use latest launcher release use os specific artifact option option detect os via browser and select zip accordingly currently is blocked by the latest release currently not providing os specific artifacts
| 0
|
21,933
| 30,446,677,227
|
IssuesEvent
|
2023-07-15 19:08:54
|
h4sh5/pypi-auto-scanner
|
https://api.github.com/repos/h4sh5/pypi-auto-scanner
|
opened
|
pyutils 0.0.1b8 has 2 GuardDog issues
|
guarddog typosquatting silent-process-execution
|
https://pypi.org/project/pyutils
https://inspector.pypi.io/project/pyutils
```
{
"dependency": "pyutils",
"version": "0.0.1b8",
"result": {
"issues": 2,
"errors": {},
"results": {
"typosquatting": "This package closely ressembles the following package names, and might be a typosquatting attempt: python-utils, pytils",
"silent-process-execution": [
{
"location": "pyutils/exec_utils.py/pyutils/exec_utils.py:204",
"code": " subproc = subprocess.Popen(\n args,\n stdin=subprocess.DEVNULL,\n stdout=subprocess.DEVNULL,\n stderr=subprocess.DEVNULL,\n )",
"message": "This package is silently executing an external binary, redirecting stdout, stderr and stdin to /dev/null"
}
]
},
"path": "/tmp/tmpikiokiww/pyutils"
}
}
```
|
1.0
|
pyutils 0.0.1b8 has 2 GuardDog issues - https://pypi.org/project/pyutils
https://inspector.pypi.io/project/pyutils
```{
"dependency": "pyutils",
"version": "0.0.1b8",
"result": {
"issues": 2,
"errors": {},
"results": {
"typosquatting": "This package closely ressembles the following package names, and might be a typosquatting attempt: python-utils, pytils",
"silent-process-execution": [
{
"location": "pyutils/exec_utils.py/pyutils/exec_utils.py:204",
"code": " subproc = subprocess.Popen(\n args,\n stdin=subprocess.DEVNULL,\n stdout=subprocess.DEVNULL,\n stderr=subprocess.DEVNULL,\n )",
"message": "This package is silently executing an external binary, redirecting stdout, stderr and stdin to /dev/null"
}
]
},
"path": "/tmp/tmpikiokiww/pyutils"
}
}```
|
process
|
pyutils has guarddog issues dependency pyutils version result issues errors results typosquatting this package closely ressembles the following package names and might be a typosquatting attempt python utils pytils silent process execution location pyutils exec utils py pyutils exec utils py code subproc subprocess popen n args n stdin subprocess devnull n stdout subprocess devnull n stderr subprocess devnull n message this package is silently executing an external binary redirecting stdout stderr and stdin to dev null path tmp tmpikiokiww pyutils
| 1
|
68,476
| 17,315,446,946
|
IssuesEvent
|
2021-07-27 05:05:13
|
google/mediapipe
|
https://api.github.com/repos/google/mediapipe
|
closed
|
Errors to link absl. ld: symbol(s) not found for architecture arm64
|
platform:ios solution:hands stalled stat:awaiting response type:build/install
|
I tried to build a framework for iOS. But I got the following link errors.
Could you give me some suggestions to fix this issue? Thanks!
============Link errors==========
Undefined symbols for architecture arm64:
"absl::lts_2020_09_23::InternalError(absl::lts_2020_09_23::string_view)", referenced from:
absl::lts_2020_09_23::Status mediapipe::Packet::ValidateAsType<mediapipe::NormalizedLandmarkList>() const in libHandTrackerLibrary.a(HandTracker.o)
"absl::lts_2020_09_23::strings_internal::CatPieces(std::initializer_list<absl::lts_2020_09_23::string_view>)", referenced from:
absl::lts_2020_09_23::Status mediapipe::Packet::ValidateAsType<mediapipe::NormalizedLandmarkList>() const in libHandTrackerLibrary.a(HandTracker.o)
"absl::lts_2020_09_23::InvalidArgumentError(absl::lts_2020_09_23::string_view)", referenced from:
absl::lts_2020_09_23::Status mediapipe::Packet::ValidateAsType<mediapipe::NormalizedLandmarkList>() const in libHandTrackerLibrary.a(HandTracker.o)
absl::lts_2020_09_23::StatusOr<std::__1::vector<google::protobuf::MessageLite const*, std::__1::allocator<google::protobuf::MessageLite const*> > > mediapipe::packet_internal::ConvertToVectorOfProtoMessageLitePtrs<mediapipe::NormalizedLandmarkList>(mediapipe::NormalizedLandmarkList const*, std::__1::integral_constant<bool, false>) in libHandTrackerLibrary.a(HandTracker.o)
"absl::lts_2020_09_23::ByString::ByString(absl::lts_2020_09_23::string_view)", referenced from:
absl::lts_2020_09_23::strings_internal::Splitter<absl::lts_2020_09_23::strings_internal::SelectDelimiter<char const*>::type, absl::lts_2020_09_23::AllowEmpty> absl::lts_2020_09_23::StrSplit<char const*>(absl::lts_2020_09_23::strings_internal::ConvertibleToStringView, char const*) in libHandTrackerLibrary.a(HandTracker.o)
"absl::lts_2020_09_23::operator<<(std::__1::basic_ostream<char, std::__1::char_traits<char> >&, absl::lts_2020_09_23::string_view)", referenced from:
mediapipe::NormalizedLandmarkList const& mediapipe::Packet::Get<mediapipe::NormalizedLandmarkList>() const in libHandTrackerLibrary.a(HandTracker.o)
"absl::lts_2020_09_23::ByString::Find(absl::lts_2020_09_23::string_view, unsigned long) const", referenced from:
absl::lts_2020_09_23::strings_internal::SplitIterator<absl::lts_2020_09_23::strings_internal::Splitter<absl::lts_2020_09_23::ByString, absl::lts_2020_09_23::AllowEmpty> >::operator++() in libHandTrackerLibrary.a(HandTracker.o)
ld: symbol(s) not found for architecture arm64
clang++: error: linker command failed with exit code 1 (use -v to see invocation)
Error in child process '/usr/bin/xcrun'. 1
Target //mediapipe/develop:HandTracker failed to build
INFO: Elapsed time: 7.163s, Critical Path: 5.87s
INFO: 4 processes: 2 internal, 2 darwin-sandbox.
FAILED: Build did NOT complete successfully
============The Bazel script==========
load("@build_bazel_rules_apple//apple:ios.bzl", "ios_framework")
ios_framework(
name = "HandTracker",
hdrs = [
"HandTracker.h",
],
infoplists = ["Info.plist"],
bundle_id = "dev.noppe.HandTracker",
families = ["iphone", "ipad"],
minimum_os_version = "10.0",
deps = [
":HandTrackerLibrary",
"@ios_opencv//:OpencvFramework",
],
)
objc_library(
name = "HandTrackerLibrary",
srcs = [
"HandTracker.mm",
],
hdrs = [
"HandTracker.h",
],
data = [
"//mediapipe/graphs/hand_tracking:hand_tracking_mobile_gpu_binary_graph",
"//mediapipe/modules/hand_landmark:hand_landmark.tflite",
"//mediapipe/modules/hand_landmark:handedness.txt",
"//mediapipe/modules/palm_detection:palm_detection.tflite",
"//mediapipe/modules/palm_detection:palm_detection_labelmap.txt",
],
sdk_frameworks = [
"AVFoundation",
"CoreGraphics",
"CoreMedia",
"UIKit"
],
deps = [
"//mediapipe/objc:mediapipe_framework_ios",
"//mediapipe/objc:mediapipe_input_sources_ios",
"//mediapipe/objc:mediapipe_layer_renderer",
] + select({
"//mediapipe:ios_i386": [],
"//mediapipe:ios_x86_64": [],
"//conditions:default": [
"//mediapipe/graphs/hand_tracking:mobile_calculators",
"//mediapipe/framework/formats:landmark_cc_proto",
],
}),
)
========The command I ran======
bazel build -c opt --config=ios_arm64 --verbose_failures mediapipe/develop:HandTracker
|
1.0
|
Errors to link absl. ld: symbol(s) not found for architecture arm64 - I tried to build a framework for iOS. But I got the following link errors.
Could you give me some suggestions to fix this issue? Thanks!
============Link errors==========
Undefined symbols for architecture arm64:
"absl::lts_2020_09_23::InternalError(absl::lts_2020_09_23::string_view)", referenced from:
absl::lts_2020_09_23::Status mediapipe::Packet::ValidateAsType<mediapipe::NormalizedLandmarkList>() const in libHandTrackerLibrary.a(HandTracker.o)
"absl::lts_2020_09_23::strings_internal::CatPieces(std::initializer_list<absl::lts_2020_09_23::string_view>)", referenced from:
absl::lts_2020_09_23::Status mediapipe::Packet::ValidateAsType<mediapipe::NormalizedLandmarkList>() const in libHandTrackerLibrary.a(HandTracker.o)
"absl::lts_2020_09_23::InvalidArgumentError(absl::lts_2020_09_23::string_view)", referenced from:
absl::lts_2020_09_23::Status mediapipe::Packet::ValidateAsType<mediapipe::NormalizedLandmarkList>() const in libHandTrackerLibrary.a(HandTracker.o)
absl::lts_2020_09_23::StatusOr<std::__1::vector<google::protobuf::MessageLite const*, std::__1::allocator<google::protobuf::MessageLite const*> > > mediapipe::packet_internal::ConvertToVectorOfProtoMessageLitePtrs<mediapipe::NormalizedLandmarkList>(mediapipe::NormalizedLandmarkList const*, std::__1::integral_constant<bool, false>) in libHandTrackerLibrary.a(HandTracker.o)
"absl::lts_2020_09_23::ByString::ByString(absl::lts_2020_09_23::string_view)", referenced from:
absl::lts_2020_09_23::strings_internal::Splitter<absl::lts_2020_09_23::strings_internal::SelectDelimiter<char const*>::type, absl::lts_2020_09_23::AllowEmpty> absl::lts_2020_09_23::StrSplit<char const*>(absl::lts_2020_09_23::strings_internal::ConvertibleToStringView, char const*) in libHandTrackerLibrary.a(HandTracker.o)
"absl::lts_2020_09_23::operator<<(std::__1::basic_ostream<char, std::__1::char_traits<char> >&, absl::lts_2020_09_23::string_view)", referenced from:
mediapipe::NormalizedLandmarkList const& mediapipe::Packet::Get<mediapipe::NormalizedLandmarkList>() const in libHandTrackerLibrary.a(HandTracker.o)
"absl::lts_2020_09_23::ByString::Find(absl::lts_2020_09_23::string_view, unsigned long) const", referenced from:
absl::lts_2020_09_23::strings_internal::SplitIterator<absl::lts_2020_09_23::strings_internal::Splitter<absl::lts_2020_09_23::ByString, absl::lts_2020_09_23::AllowEmpty> >::operator++() in libHandTrackerLibrary.a(HandTracker.o)
ld: symbol(s) not found for architecture arm64
clang++: error: linker command failed with exit code 1 (use -v to see invocation)
Error in child process '/usr/bin/xcrun'. 1
Target //mediapipe/develop:HandTracker failed to build
INFO: Elapsed time: 7.163s, Critical Path: 5.87s
INFO: 4 processes: 2 internal, 2 darwin-sandbox.
FAILED: Build did NOT complete successfully
============The Bazel script==========
load("@build_bazel_rules_apple//apple:ios.bzl", "ios_framework")
ios_framework(
name = "HandTracker",
hdrs = [
"HandTracker.h",
],
infoplists = ["Info.plist"],
bundle_id = "dev.noppe.HandTracker",
families = ["iphone", "ipad"],
minimum_os_version = "10.0",
deps = [
":HandTrackerLibrary",
"@ios_opencv//:OpencvFramework",
],
)
objc_library(
name = "HandTrackerLibrary",
srcs = [
"HandTracker.mm",
],
hdrs = [
"HandTracker.h",
],
data = [
"//mediapipe/graphs/hand_tracking:hand_tracking_mobile_gpu_binary_graph",
"//mediapipe/modules/hand_landmark:hand_landmark.tflite",
"//mediapipe/modules/hand_landmark:handedness.txt",
"//mediapipe/modules/palm_detection:palm_detection.tflite",
"//mediapipe/modules/palm_detection:palm_detection_labelmap.txt",
],
sdk_frameworks = [
"AVFoundation",
"CoreGraphics",
"CoreMedia",
"UIKit"
],
deps = [
"//mediapipe/objc:mediapipe_framework_ios",
"//mediapipe/objc:mediapipe_input_sources_ios",
"//mediapipe/objc:mediapipe_layer_renderer",
] + select({
"//mediapipe:ios_i386": [],
"//mediapipe:ios_x86_64": [],
"//conditions:default": [
"//mediapipe/graphs/hand_tracking:mobile_calculators",
"//mediapipe/framework/formats:landmark_cc_proto",
],
}),
)
========The command I ran======
bazel build -c opt --config=ios_arm64 --verbose_failures mediapipe/develop:HandTracker
|
non_process
|
errors to link absl ld symbol s not found for architecture i tried to build a framework for ios but i got the following link errors could you give me some suggestions to fix this issue thanks link errors undefined symbols for architecture absl lts internalerror absl lts string view referenced from absl lts status mediapipe packet validateastype const in libhandtrackerlibrary a handtracker o absl lts strings internal catpieces std initializer list referenced from absl lts status mediapipe packet validateastype const in libhandtrackerlibrary a handtracker o absl lts invalidargumenterror absl lts string view referenced from absl lts status mediapipe packet validateastype const in libhandtrackerlibrary a handtracker o absl lts statusor mediapipe packet internal converttovectorofprotomessageliteptrs mediapipe normalizedlandmarklist const std integral constant in libhandtrackerlibrary a handtracker o absl lts bystring bystring absl lts string view referenced from absl lts strings internal splitter type absl lts allowempty absl lts strsplit absl lts strings internal convertibletostringview char const in libhandtrackerlibrary a handtracker o absl lts operator absl lts string view referenced from mediapipe normalizedlandmarklist const mediapipe packet get const in libhandtrackerlibrary a handtracker o absl lts bystring find absl lts string view unsigned long const referenced from absl lts strings internal splititerator operator in libhandtrackerlibrary a handtracker o ld symbol s not found for architecture clang error linker command failed with exit code use v to see invocation error in child process usr bin xcrun target mediapipe develop handtracker failed to build info elapsed time critical path info processes internal darwin sandbox failed build did not complete successfully the bazel script load build bazel rules apple apple ios bzl ios framework ios framework name handtracker hdrs handtracker h infoplists bundle id dev noppe handtracker families minimum os version deps handtrackerlibrary ios opencv opencvframework objc library name handtrackerlibrary srcs handtracker mm hdrs handtracker h data mediapipe graphs hand tracking hand tracking mobile gpu binary graph mediapipe modules hand landmark hand landmark tflite mediapipe modules hand landmark handedness txt mediapipe modules palm detection palm detection tflite mediapipe modules palm detection palm detection labelmap txt sdk frameworks avfoundation coregraphics coremedia uikit deps mediapipe objc mediapipe framework ios mediapipe objc mediapipe input sources ios mediapipe objc mediapipe layer renderer select mediapipe ios mediapipe ios conditions default mediapipe graphs hand tracking mobile calculators mediapipe framework formats landmark cc proto the command i ran bazel build c opt config ios verbose failures mediapipe develop handtracker
| 0
|
16,508
| 21,511,879,897
|
IssuesEvent
|
2022-04-28 05:56:42
|
NationalSecurityAgency/ghidra
|
https://api.github.com/repos/NationalSecurityAgency/ghidra
|
closed
|
SLEIGH: How to Support Sin, Cos, and Sqrt?
|
Type: Question Feature: Processor/SuperH Feature: Sleigh
|
I would like to close https://github.com/NationalSecurityAgency/ghidra/issues/1730. I believe I have disassembly for fsca and fsrra working. I have no clue how to implement the pseudo code (http://www.shared-ptr.com/sh_insns.html):
fsca:
```
void FSCA (int n)
{
if (FPSCR_PR != 0)
undefined_operation ();
else
{
float angle;
long offset = 0x00010000;
long fraction = 0x0000FFFF;
set_I ();
fraction &= FPUL; // extract sub-rotation (fraction) part
angle = fraction; // convert to float
angle = 2 * M_PI * angle / offset; // convert to radian
FR[n] = sin (angle);
FR[n+1] = cos (angle);
PC += 2;
}
}
```
fsrra:
```
void FSRRA (int n)
{
if (FPSCR_PR != 0)
undefined_operation ();
else
{
PC += 2;
clear_cause();
switch (data_type_of (n))
{
case NORM:
if (sign_of (n) == 0)
{
set_I ();
FR[n] = 1 / sqrt (FR[n]);
}
else
invalid (n);
break;
case DENORM:
if (sign_of (n) == 0)
fpu_error ();
else
invalid (n);
break;
case PZERO:
case NZERO:
dz (n, sign_of (n));
break;
case PINF:
FR[n] = 0;
break;
case NINF:
invalid (n);
break;
case qNAN:
qnan (n);
break;
case sNAN:
invalid (n);
break;
}
}
}
```
Any advice on how to tackle this? I found references in ia.sinc for pcodeops for cos. I don't quite understand it. Thanks in advance.
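One way to get unstuck is to model the pseudo code in an ordinary language first and check expected values; below is a minimal TypeScript sketch of the FSCA angle math (host-side only, not SLEIGH itself — in SLEIGH, sin/cos would normally be declared as user-defined pcodeops, like the ones referenced in ia.sinc):
```
// Models the FSCA pseudo code above so expected FR[n]/FR[n+1] values can
// be checked before wiring the semantics into the .sinc file.
function fsca(fpul: number): { sin: number; cos: number } {
  const offset = 0x00010000;          // one full rotation in FPUL units
  const fraction = fpul & 0x0000ffff; // extract sub-rotation (fraction) part
  const angle = (2 * Math.PI * fraction) / offset; // convert to radians
  return { sin: Math.sin(angle), cos: Math.cos(angle) };
}

// A quarter rotation (0x4000 of 0x10000) should give sin ~ 1, cos ~ 0:
console.log(fsca(0x4000));
```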
|
1.0
|
SLEIGH: How to Support Sin, Cos, and Sqrt? - I would like to close https://github.com/NationalSecurityAgency/ghidra/issues/1730. I believe I have disassembly for fsca and fsrra working. I have no clue to how to implement the pseudo code (http://www.shared-ptr.com/sh_insns.html):
fsca:
```
void FSCA (int n)
{
if (FPSCR_PR != 0)
undefined_operation ();
else
{
float angle;
long offset = 0x00010000;
long fraction = 0x0000FFFF;
set_I ();
fraction &= FPUL; // extract sub-rotation (fraction) part
angle = fraction; // convert to float
angle = 2 * M_PI * angle / offset; // convert to radian
FR[n] = sin (angle);
FR[n+1] = cos (angle);
PC += 2;
}
}
```
fsrra:
```
void FSRRA (int n)
{
if (FPSCR_PR != 0)
undefined_operation ();
else
{
PC += 2;
clear_cause();
switch (data_type_of (n))
{
case NORM:
if (sign_of (n) == 0)
{
set_I ();
FR[n] = 1 / sqrt (FR[n]);
}
else
invalid (n);
break;
case DENORM:
if (sign_of (n) == 0)
fpu_error ();
else
invalid (n);
break;
case PZERO:
case NZERO:
dz (n, sign_of (n));
break;
case PINF:
FR[n] = 0;
break;
case NINF:
invalid (n);
break;
case qNAN:
qnan (n);
break;
case sNAN:
invalid (n);
break;
}
}
}
```
Any advice on how to tackle this? I found references in ia.sinc for pcodeops for cos. I don't quite understand it. Thanks in advance.
|
process
|
sleigh how to support sin cos and sqrt i would like to close i believe i have disassembly for fsca and fsrra working i have no clue to how to implement the pseudo code fsca void fsca int n if fpscr pr undefined operation else float angle long offset long fraction set i fraction fpul extract sub rotation fraction part angle fraction convert to float angle m pi angle offset convert to radian fr sin angle fr cos angle pc fsrra void fsrra int n if fpscr pr undefined operation else pc clear cause switch data type of n case norm if sign of n set i fr sqrt fr else invalid n break case denorm if sign of n fpu error else invalid n break case pzero case nzero dz n sign of n break case pinf fr break case ninf invalid n break case qnan qnan n break case snan invalid n break any advice on how to tackle this i found references in ia sinc for pcodeops for cos i don t quite understand it thanks in advance
| 1
|
617,276
| 19,346,749,627
|
IssuesEvent
|
2021-12-15 11:38:18
|
numba/numba
|
https://api.github.com/repos/numba/numba
|
closed
|
API for Numba IR rewrite passes
|
Numba1.0 mediumpriority feature_request
|
There is currently no official API for adding rewrite passes to the compiler. We should fix that, as external projects may want to add support for various optimizations or custom code transformations.
|
1.0
|
API for Numba IR rewrite passes - There is currently no official API for adding rewrite passes to the compiler. We should fix that, as external projects may want to add support for various optimizations or custom code transformations.
|
non_process
|
api for numba ir rewrite passes there is currently no official api for adding rewrite passes to the compiler we should fix that as external projects may want to add support for various optimizations or custom code transformations
| 0
|
165,654
| 12,878,662,604
|
IssuesEvent
|
2020-07-11 17:43:58
|
Havret/activemq-artemis-extensions-aspnetcore
|
https://api.github.com/repos/Havret/activemq-artemis-extensions-aspnetcore
|
closed
|
Increase test coverage %
|
test
|
Not all public methods are covered with tests. There are some AddConsumer and AddProducer overloads that lack tests.
|
1.0
|
Increase test coverage % - Not all public methods are covered with tests. There are some AddConsumer and AddProducer overloads that lack tests.
|
non_process
|
increase test coverage not all public methods are covered with tests there are some addconsumer and addproducer overloads that lack tests
| 0
|
679,770
| 23,244,547,357
|
IssuesEvent
|
2022-08-03 18:45:36
|
ccmbioinfo/osmp
|
https://api.github.com/repos/ccmbioinfo/osmp
|
closed
|
Node: Replace session secret with environment variable
|
node high priority
|
The backend Express server currently uses a hard-coded secret for signing session cookies. This secret should be replaced with a randomly-generated secret pulled from an environment variable (e.g. `SERVER_SESSION_SECRET`).
The scope of this issue includes adding the new environment variable, updating the `docker-compose*.yaml` files accordingly, updating the GitHub workflows where necessary, adding the randomly-generated secret to GitHub environments, and updating the line below to use the new environment secret in place of 'ssmp'.
https://github.com/ccmbioinfo/osmp/blob/0ecd7fa323126bd34a1daa33a08cfd82659abc65/server/src/server.ts#L30-L37
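A minimal sketch of the server-side change (hedged: it assumes express-session, which the hard-coded 'ssmp' secret suggests, and the fail-fast check is an added suggestion rather than part of the issue):
```typescript
import express from "express";
import session from "express-session";

// Pull the signing secret from the environment instead of hard-coding it.
const secret = process.env.SERVER_SESSION_SECRET;
if (!secret) {
  // Fail fast rather than silently falling back to a weak default.
  throw new Error("SERVER_SESSION_SECRET is not set");
}

const app = express();
app.use(
  session({
    secret, // replaces the hard-coded 'ssmp'
    resave: false,
    saveUninitialized: false,
  })
);
```
The docker-compose files and workflows then only need to pass `SERVER_SESSION_SECRET` through; the value itself lives in the GitHub environment secrets.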
|
1.0
|
Node: Replace session secret with environment variable - The backend Express server currently uses a hard-coded secret for signing session cookies. This secret should be replaced with a randomly-generated secret pulled from an environment variable (e.g. `SERVER_SESSION_SECRET`).
The scope of this issue includes adding the new environment variable, updating the `docker-compose*.yaml` files accordingly, updating the GitHub workflows where necessary, adding the randomly-generated secret to GitHub environments, and updating the line below to use the new environment secret in place of 'ssmp'.
https://github.com/ccmbioinfo/osmp/blob/0ecd7fa323126bd34a1daa33a08cfd82659abc65/server/src/server.ts#L30-L37
|
non_process
|
node replace session secret with environment variable the backend express server currently uses a hard coded secret for signing session cookies this secret should be replaced with a randomly generated secret pulled from an environment variable eg server session secret the scope of this issue includes adding the new environment variable updating the docker compose yaml files accordingly updating the github workflows where necessary adding the randomly generated secret to github environments and updating the line below to use the new environment secret in place of ssmp
| 0
|
4,754
| 7,613,521,605
|
IssuesEvent
|
2018-05-01 21:33:20
|
alexstone/timecard
|
https://api.github.com/repos/alexstone/timecard
|
closed
|
Create process to update `hours_logged` for Client models
|
backend process
|
When a `ClientProject` model is updated/created/deleted, a process should fire to update the `Client` `hours_logged` field.
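A hedged TypeScript sketch of the recomputation logic (hypothetical model shapes; the issue doesn't show the app's actual ORM or hook mechanism). Recomputing from scratch, rather than incrementing, stays correct for all three of create, update, and delete:
```typescript
// Hypothetical shapes; the real models live in the timecard app.
interface ClientProject { clientId: number; hoursLogged: number; }
interface Client { id: number; hoursLogged: number; }

// Recompute the aggregate from the surviving projects.
function recomputeHoursLogged(client: Client, projects: ClientProject[]): Client {
  const total = projects
    .filter((p) => p.clientId === client.id)
    .reduce((sum, p) => sum + p.hoursLogged, 0);
  return { ...client, hoursLogged: total };
}

// Wire this into the ClientProject create/update/delete hooks, then persist
// the returned client.
```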
|
1.0
|
Create process to update `hours_logged` for Client models - When a `ClientProject` model is updated/created/deleted, a process should fire to update the `Client` `hours_logged` field.
|
process
|
create process to update hours logged for client models when a clientproject model is updated created deleted a process should fire to update the client hours logged field
| 1
|
142,924
| 13,044,584,702
|
IssuesEvent
|
2020-07-29 05:09:51
|
datastax/cass-operator
|
https://api.github.com/repos/datastax/cass-operator
|
closed
|
validation fails if multiple watch namespaces
|
bug documentation enhancement
|
If the WATCH_NAMESPACES var is modified to hold multiple namespaces, ensureWebhookConfigVolume and ensureWebhookCertificate fail.
The namespace var is imported from the WATCH_NAMESPACE env var here:
https://github.com/datastax/cass-operator/blob/master/operator/cmd/manager/main.go#L97
While the use of multiple namespaces has been accounted for here (https://github.com/datastax/cass-operator/blob/master/operator/cmd/manager/main.go#L147),
the two validations before this cause the container to fail:
https://github.com/datastax/cass-operator/blob/master/operator/cmd/manager/main.go#L128
It might be better to add an additional env var, e.g. OPERATOR_NAMESPACE, to read in and use here, setting it to the value of the operator's namespace specifically.
Then use the watch_namespace var purely for its intended function.
|
1.0
|
validation fails if multiple watch namespaces - If the WATCH_NAMESPACES var is modified to hold multiple namespaces, ensureWebhookConfigVolume and ensureWebhookCertificate fail.
The namespace var is imported from the WATCH_NAMESPACE env var here:
https://github.com/datastax/cass-operator/blob/master/operator/cmd/manager/main.go#L97
While the use of multiple namespaces has been accounted for here (https://github.com/datastax/cass-operator/blob/master/operator/cmd/manager/main.go#L147),
the two validations before this cause the container to fail:
https://github.com/datastax/cass-operator/blob/master/operator/cmd/manager/main.go#L128
It might be better to add an additional env var, e.g. OPERATOR_NAMESPACE, to read in and use here, setting it to the value of the operator's namespace specifically.
Then use the watch_namespace var purely for its intended function.
|
non_process
|
validation fails if multiple watch namespaces if modifying the watch namespaces var ensurewebhookconfigvolume and ensurewebhookcertificate fails the namespace var is imported from the watch namespace env var here while the use of multiple namespaces has been accounted for here the two validations before this cause the container to fail it might be better to add an additional env var e g operator namespace to read in and use here setting it to the value of the operators namespace specifically then use the watch namespace var purely for its intended function
| 0
|
161,348
| 6,122,873,371
|
IssuesEvent
|
2017-06-23 01:47:28
|
Azure/acs-engine
|
https://api.github.com/repos/Azure/acs-engine
|
closed
|
sshd_config messed with DC/OS template
|
needs more information orchestrator/dcos priority/P1
|
When creating a new cluster using the DC/OS template, sshd_config is messed up, resulting in the following lines in the sshd_config file:
Port 22Port 2222
This results in the leader not being chosen and an inability to log into the master.
The JSON template generated has the following, which seems to be the issue.
\"sed -i \\\"s/^Port 22$/Port 22\\\\nPort 2222/1\\\" /etc/ssh/sshd_config\"
|
1.0
|
sshd_config messed with DC/OS template - When creating a new cluster using the DC/OS template, sshd_config is messed up, resulting in the following lines in the sshd_config file:
Port 22Port 2222
This results in the leader not being chosen and an inability to log into the master.
The JSON template generated has the following, which seems to be the issue.
\"sed -i \\\"s/^Port 22$/Port 22\\\\nPort 2222/1\\\" /etc/ssh/sshd_config\"
|
non_process
|
sshd config messed with dc os template when creating a new cluster using dc os template sshd config is messed up resulting in the following lines in sshd config file port this results in leader not being chosen and inability to log into master the json template generated has the following which seems to be the issue sed i s port port nport etc ssh sshd config
| 0
|
222,978
| 24,711,514,174
|
IssuesEvent
|
2022-10-20 01:27:12
|
n-devs/freebitco.in-mobile
|
https://api.github.com/repos/n-devs/freebitco.in-mobile
|
opened
|
CVE-2022-37601 (High) detected in loader-utils-1.2.3.tgz
|
security vulnerability
|
## CVE-2022-37601 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>loader-utils-1.2.3.tgz</b></p></summary>
<p>utils for webpack loaders</p>
<p>Library home page: <a href="https://registry.npmjs.org/loader-utils/-/loader-utils-1.2.3.tgz">https://registry.npmjs.org/loader-utils/-/loader-utils-1.2.3.tgz</a></p>
<p>Path to dependency file: /freebitco.in-mobile/package.json</p>
<p>Path to vulnerable library: /node_modules/loader-utils/package.json</p>
<p>
Dependency Hierarchy:
- react-scripts-3.0.1.tgz (Root Library)
- webpack-4.1.0.tgz
- :x: **loader-utils-1.2.3.tgz** (Vulnerable Library)
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
Prototype pollution vulnerability in function parseQuery in parseQuery.js in webpack loader-utils 2.0.0 via the name variable in parseQuery.js.
<p>Publish Date: 2022-10-12
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2022-37601>CVE-2022-37601</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>9.8</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: High
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Release Date: 2022-10-12</p>
<p>Fix Resolution (loader-utils): 2.0.0</p>
<p>Direct dependency fix Resolution (react-scripts): 5.0.1</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with Mend [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
|
True
|
CVE-2022-37601 (High) detected in loader-utils-1.2.3.tgz - ## CVE-2022-37601 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>loader-utils-1.2.3.tgz</b></p></summary>
<p>utils for webpack loaders</p>
<p>Library home page: <a href="https://registry.npmjs.org/loader-utils/-/loader-utils-1.2.3.tgz">https://registry.npmjs.org/loader-utils/-/loader-utils-1.2.3.tgz</a></p>
<p>Path to dependency file: /freebitco.in-mobile/package.json</p>
<p>Path to vulnerable library: /node_modules/loader-utils/package.json</p>
<p>
Dependency Hierarchy:
- react-scripts-3.0.1.tgz (Root Library)
- webpack-4.1.0.tgz
- :x: **loader-utils-1.2.3.tgz** (Vulnerable Library)
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
Prototype pollution vulnerability in function parseQuery in parseQuery.js in webpack loader-utils 2.0.0 via the name variable in parseQuery.js.
<p>Publish Date: 2022-10-12
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2022-37601>CVE-2022-37601</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>9.8</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: High
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Release Date: 2022-10-12</p>
<p>Fix Resolution (loader-utils): 2.0.0</p>
<p>Direct dependency fix Resolution (react-scripts): 5.0.1</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with Mend [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
|
non_process
|
cve high detected in loader utils tgz cve high severity vulnerability vulnerable library loader utils tgz utils for webpack loaders library home page a href path to dependency file freebitco in mobile package json path to vulnerable library node modules loader utils package json dependency hierarchy react scripts tgz root library webpack tgz x loader utils tgz vulnerable library vulnerability details prototype pollution vulnerability in function parsequery in parsequery js in webpack loader utils via the name variable in parsequery js publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction none scope unchanged impact metrics confidentiality impact high integrity impact high availability impact high for more information on scores click a href suggested fix type upgrade version release date fix resolution loader utils direct dependency fix resolution react scripts step up your open source security game with mend
| 0
|
21,045
| 27,992,169,878
|
IssuesEvent
|
2023-03-27 05:17:25
|
open-telemetry/opentelemetry-collector-contrib
|
https://api.github.com/repos/open-telemetry/opentelemetry-collector-contrib
|
closed
|
[attributeprocessor] Support sourcing attribute value from current timestamp
|
Stale processor/attributes closed as inactive
|
**Is your feature request related to a problem? Please describe.**
We would like to be able to add an attribute to traces/metrics/logs whose value is the time at which it passed through the collector.
**Describe the solution you'd like**
Implement this change in the AttrProc whose logic is shared by both the Attribute Processor and Resource Processor.
In addition to sourcing the attribute value from `value` or `from_attribute`, add the ability to source the value from the current timestamp with `from_time`. The resulting attribute is an integer attribute in seconds, milliseconds, or nanoseconds (configurable) since the unix epoch.
An example configuration would look like
```yaml
processors:
attributes:
actions:
- action: insert
key: ingress.time
from_time: milliseconds
```
In this case, the attribute ingress.time will be inserted as the current timestamp in milliseconds since the unix epoch.
We have a custom processor that implements this feature which we are willing to contribute.
|
1.0
|
[attributeprocessor] Support sourcing attribute value from current timestamp - **Is your feature request related to a problem? Please describe.**
We would like to be able to add an attribute to traces/metrics/logs whose value is the time at which it passed through the collector.
**Describe the solution you'd like**
Implement this change in the AttrProc whose logic is shared by both the Attribute Processor and Resource Processor.
In addition to sourcing the attribute value from `value` or `from_attribute`, add the ability to source the value from the current timestamp with `from_time`. The resulting attribute is an integer attribute in seconds, milliseconds, or nanoseconds (configurable) since the unix epoch.
An example configuration would look like
```yaml
processors:
attributes:
actions:
- action: insert
key: ingress.time
from_time: milliseconds
```
In this case, the attribute ingress.time will be inserted as the current timestamp in milliseconds since the unix epoch.
We have a custom processor that implements this feature which we are willing to contribute.
|
process
|
support sourcing attribute value from current timestamp is your feature request related to a problem please describe we would like to be able to add an attribute to traces metrics logs whose value is the time at which it passed through the collector describe the solution you d like implement this change in the attrproc whose logic is shared by both the attribute processor and resource processor in addition to sourcing the attribute value from value or from attribute add the ability to source the value from the current timestamp with from time the resulting attribute is an integer attribute in seconds milliseconds or nanoseconds configurable since the unix epoch an example configuration would look like yaml processors attributes actions action insert key ingress time from time milliseconds in this case the attribute ingress time will be inserted as the current timestamp in milliseconds since the unix epoch we have a custom processor that implements this feature which we are willing to contribute
| 1
|
11,159
| 13,957,693,814
|
IssuesEvent
|
2020-10-24 08:11:05
|
alexanderkotsev/geoportal
|
https://api.github.com/repos/alexanderkotsev/geoportal
|
opened
|
LU: Missing resources in Geoportal
|
Geoportal Harvesting process LU - Luxembourg
|
Collected from the Geoportal Workshop online survey answers:
Sometimes we get the "INSPIRE_SPATIAL_OBJECT_IS_AVAILABLE" error: see, for example, our dataset "INSPIRE - Annex I Theme Transport Networks - Roads - RoadLink"
(http://catalog.inspire.geoportail.lu/geonetwork/srv/eng/catalog.search#/metadata/57e7e401-8513-4562-bffe-78c26f5a2df7) and the report (http://inspire-geoportal.ec.europa.eu/resources/INSPIRE-93ee1068-1dc3-11e7-a02d-52540023a883_20181127-150506/services/1/PullResults/221-240/datasets/4/resourceReport/), which only gives an 87.5% on "interoperability". The GML does contain the right feature type. Perhaps the dataset is too large? Or is it because the dataset is zipped?
|
1.0
|
LU: Missing resources in Geoportal - Collected from the Geoportal Workshop online survey answers:
Sometimes we get the "INSPIRE_SPATIAL_OBJECT_IS_AVAILABLE" error: see, for example, our dataset "INSPIRE - Annex I Theme Transport Networks - Roads - RoadLink"
(http://catalog.inspire.geoportail.lu/geonetwork/srv/eng/catalog.search#/metadata/57e7e401-8513-4562-bffe-78c26f5a2df7) and the report (http://inspire-geoportal.ec.europa.eu/resources/INSPIRE-93ee1068-1dc3-11e7-a02d-52540023a883_20181127-150506/services/1/PullResults/221-240/datasets/4/resourceReport/), which only gives an 87.5% on "interoperability". The GML does contain the right feature type. Perhaps the dataset is too large? Or is it because the dataset is zipped?
|
process
|
lu missing resources in geoportal collected from the geoportal workshop online survey answers sometimes we get the quot inspire spatial object is available quot error see for example our dataset quot inspire annex i theme transport networks roads roadlink quot and the report which only gives a on quot interoperability quot the gml does contain the right feature type perhaps the dataset is too large or because the dataset is zipped
| 1
|
10,803
| 13,609,287,946
|
IssuesEvent
|
2020-09-23 04:50:10
|
googleapis/java-recommendations-ai
|
https://api.github.com/repos/googleapis/java-recommendations-ai
|
closed
|
Dependency Dashboard
|
api: recommendationengine type: process
|
This issue contains a list of Renovate updates and their statuses.
## Open
These updates have all been created already. Click a checkbox below to force a retry/rebase of any.
- [ ] <!-- rebase-branch=renovate/com.google.cloud-google-cloud-recommendations-ai-0.x -->chore(deps): update dependency com.google.cloud:google-cloud-recommendations-ai to v0.3.1
---
- [ ] <!-- manual job -->Check this box to trigger a request for Renovate to run again on this repository
|
1.0
|
Dependency Dashboard - This issue contains a list of Renovate updates and their statuses.
## Open
These updates have all been created already. Click a checkbox below to force a retry/rebase of any.
- [ ] <!-- rebase-branch=renovate/com.google.cloud-google-cloud-recommendations-ai-0.x -->chore(deps): update dependency com.google.cloud:google-cloud-recommendations-ai to v0.3.1
---
- [ ] <!-- manual job -->Check this box to trigger a request for Renovate to run again on this repository
|
process
|
dependency dashboard this issue contains a list of renovate updates and their statuses open these updates have all been created already click a checkbox below to force a retry rebase of any chore deps update dependency com google cloud google cloud recommendations ai to check this box to trigger a request for renovate to run again on this repository
| 1
|
15,917
| 2,869,094,825
|
IssuesEvent
|
2015-06-05 23:17:35
|
dart-lang/sdk
|
https://api.github.com/repos/dart-lang/sdk
|
closed
|
TestCase location hint
|
Area-Pkg Pkg-Unittest Priority-Medium Triaged Type-Defect
|
*This issue was originally filed by fedor.kor...@gmail.com*
_____
Add locationHint field with location of the Test.
|
1.0
|
TestCase location hint - *This issue was originally filed by fedor.kor...@gmail.com*
_____
Add locationHint field with location of the Test.
|
non_process
|
testcase location hint this issue was originally filed by fedor kor gmail com add locationhint field with location of the test
| 0
|
18,299
| 24,412,161,918
|
IssuesEvent
|
2022-10-05 13:16:09
|
Project60/org.project60.sepa
|
https://api.github.com/repos/Project60/org.project60.sepa
|
closed
|
Payment processor test mode unstable
|
wontfix payment processor
|
Sometimes, the payment processor seems to produce a "Expected one SepaMandate but found 0" error while in test mode. Observed on CiviCRM 4.5.6 with today's master.
|
1.0
|
Payment processor test mode unstable - Sometimes, the payment processor seems to produce a "Expected one SepaMandate but found 0" error while in test mode. Observed on CiviCRM 4.5.6 with today's master.
|
process
|
payment processor test mode unstable sometimes the payment processor seems to produce a expected one sepamandate but found error while in test mode observed on civicrm with today s master
| 1
|
17,309
| 23,128,658,346
|
IssuesEvent
|
2022-07-28 08:24:59
|
threefoldtech/tfgrid_dashboard
|
https://api.github.com/repos/threefoldtech/tfgrid_dashboard
|
closed
|
TF Explorer Farms drop-down menus
|
process_wontfix type_bug
|
Tested on https://explorer.test.grid.tf
On the farms page in the explorer, all the filter fields except for the **Filter By Certification Type** have non-working drop-down menus, and you have to enter the data manually, unlike the nodes page.
### Nodes Page
On the nodes page, all the filters use drop-down menus for suggestions and ease of use, except for the Public IP Filter, which is required to be entered manually.

### Farms Page
On the farms page, all the filters are required to be entered manually, except for the certification type filter.


|
1.0
|
TF Explorer Farms drop-down menus - Tested on https://explorer.test.grid.tf
On the farms page in the explorer, all the filter fields except for the **Filter By Certification Type** have non-working drop-down menus, and you have to enter the data manually, unlike the nodes page.
### Nodes Page
On the nodes page, all the filters use drop-down menus for suggestions and ease of use, except for the Public IP Filter, which is required to be entered manually.

### Farms Page
On the farms page, all the filters are required to be entered manually, except for the certification type filter.


|
process
|
tf explorer farms drop down menus tested on on the farms page in he explorer all the fields for the filters except for the filter by certification type have non working drop down menus and you have to enter the data manually unlike the nodes page nodes page in the nodes page all the filters are using a drop down menus for suggestions and ease of use except for the public ip filter which is requited to be entered manually farms page in the farms page all the filters are required to be entered manually except for the certification type filter
| 1
|
17,930
| 5,534,499,061
|
IssuesEvent
|
2017-03-21 15:31:00
|
mozilla/addons-frontend
|
https://api.github.com/repos/mozilla/addons-frontend
|
closed
|
Load noscript styles through webpack
|
component: code quality triaged
|
I started trying to load the noscript styles through webpack, but unfortunately it looks like we can't get the raw CSS bundle from webpack-isomorphic-tools. There may be a way to do it, but it seems difficult/unlikely.
If we support webpack 2, we could likely switch to universal-webpack [1], which is by the same author and seems to integrate with webpack a little better. They provide documentation on how to avoid the flash of unstyled content that we have in development mode, and I believe we could use a similar technique to extract the CSS bundle for the noscript styles.
Initial work on this is in #1594.
[1] https://github.com/halt-hammerzeit/universal-webpack
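For context, one era-appropriate way to get a standalone CSS asset for the noscript styles under webpack 2 would have been extract-text-webpack-plugin; a hedged sketch with a hypothetical test pattern and output filename, not necessarily what #1594 did:
```typescript
// webpack.config.ts -- hedged sketch; the noscript test/filename is hypothetical.
import ExtractTextPlugin from "extract-text-webpack-plugin";

export default {
  module: {
    rules: [
      {
        test: /noscript\.scss$/,
        use: ExtractTextPlugin.extract({
          fallback: "style-loader",
          use: ["css-loader", "sass-loader"],
        }),
      },
    ],
  },
  // Emits a plain .css file the server can reference inside <noscript>.
  plugins: [new ExtractTextPlugin("noscript.[contenthash].css")],
};
```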
|
1.0
|
Load noscript styles through webpack - I started trying to load the noscript styles through webpack, but unfortunately it looks like we can't get the raw CSS bundle from webpack-isomorphic-tools. There may be a way to do it, but it seems difficult/unlikely.
If we support webpack 2, we could likely switch to universal-webpack [1], which is by the same author and seems to integrate with webpack a little better. They provide documentation on how to avoid the flash of unstyled content that we have in development mode, and I believe we could use a similar technique to extract the CSS bundle for the noscript styles.
Initial work on this is in #1594.
[1] https://github.com/halt-hammerzeit/universal-webpack
|
non_process
|
load noscript styles through webpack i started trying to load the noscript styles through webpack but unfortunately it looks like we can t get the raw css bundle from webpack isomorphic tools there may be a way to do it but it seems difficult unlikely if we support webpack we could likely switch to universal webpack which is by the same author and seems to integrate with webpack a little better they provide documentation on how to avoid the flash of unstyled content that we have in development mode and i believe we could use a similar technique to extract the css bundle for noscript styles initial work on this is in
| 0
|