| Unnamed: 0 | id | type | created_at | repo | repo_url | action | title | labels | body | index | text_combine | label | text | binary_label |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
168,314 | 14,145,530,523 | IssuesEvent | 2020-11-10 17:52:22 | noah-francis/SpaceNet_Software | https://api.github.com/repos/noah-francis/SpaceNet_Software | closed | Restart Versions? | documentation question | like:
*_V_3_1.mlapp --> *_V_1_1.mlapp
and just start from there again? | 1.0 | Restart Versions? - like:
*_V_3_1.mlapp --> *_V_1_1.mlapp
and just start from there again? | non_priority | restart versions like v mlapp v mlapp and just start from there again | 0 |
214,528 | 7,274,151,125 | IssuesEvent | 2018-02-21 08:59:00 | STEP-tw/battleship-phoenix | https://api.github.com/repos/STEP-tw/battleship-phoenix | closed | View settings | Low Priority small | As a _player_
I want to _have settings_
So that I may personalize language and settings_
**Additional Details**
User has landed on homepage.
Settings should be below about game option
**Acceptance Criteria**
- [x] Criteria 1
- Given _settings option_
- When _I click on settings_
- Then _I should see settings box with music, sound, language and rate_
- [x] Criteria 2
- Given _settings option_
- When _I have done setting_
- Then _I should see a cancel button at right top corner of settings box_
| 1.0 | View settings - As a _player_
I want to _have settings_
So that I may personalize language and settings_
**Additional Details**
User has landed on homepage.
Settings should be below about game option
**Acceptance Criteria**
- [x] Criteria 1
- Given _settings option_
- When _I click on settings_
- Then _I should see settings box with music, sound, language and rate_
- [x] Criteria 2
- Given _settings option_
- When _I have done setting_
- Then _I should see a cancel button at right top corner of settings box_
| priority | view settings as a player i want to have settings so that i may personalize language and settings additional details user has landed on homepage settings should be below about game option acceptance criteria criteria given settings option when i click on settings then i should see settings box with music sound language and rate criteria given settings option when i have done setting then i should see a cancel button at right top corner of settings box | 1 |
201,690 | 15,218,190,837 | IssuesEvent | 2021-02-17 17:30:29 | hazelcast/hazelcast | https://api.github.com/repos/hazelcast/hazelcast | closed | QueueSplitBrainProtectionReadTest element_splitBrainProtection[splitBrainProtectionType:READ] | Source: Internal Team: Core Type: Test-Failure | - Fails on `Hazelcast-4.master-sonar`
- Fails on [Build #698 (Nov 24, 2020 9:17:00 PM)](http://jenkins.hazelcast.com/view/Official%20Builds/job/Hazelcast-4.master-sonar/698/testReport/junit/com.hazelcast.splitbrainprotection.queue/QueueSplitBrainProtectionReadTest/element_splitBrainProtection_splitBrainProtectionType_READ_/)
- Error
```
Queue is empty!
```
- Stacktrace
```
java.util.NoSuchElementException: Queue is empty!
at com.hazelcast.collection.impl.queue.QueueProxyImpl.element(QueueProxyImpl.java:163)
at com.hazelcast.splitbrainprotection.queue.QueueSplitBrainProtectionReadTest.element_splitBrainProtection(QueueSplitBrainProtectionReadTest.java:70)
at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.base/java.lang.reflect.Method.invoke(Method.java:566)
at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:59)
at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:56)
at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
at com.hazelcast.test.FailOnTimeoutStatement$CallableStatement.call(FailOnTimeoutStatement.java:114)
at com.hazelcast.test.FailOnTimeoutStatement$CallableStatement.call(FailOnTimeoutStatement.java:106)
at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264)
at java.base/java.lang.Thread.run(Thread.java:834)
```
- Hiccups
```
Hiccups measured while running test 'element_splitBrainProtection[splitBrainProtectionType:READ](com.hazelcast.splitbrainprotection.queue.QueueSplitBrainProtectionReadTest):'
22:44:00, accumulated pauses: 314 ms, max pause: 257 ms, pauses over 1000 ms: 0
```
- Kindly check; I don't recall a recent failure of this test, and I could not identify the offending commit among the changes below
```
Ensure that RestApiConfig configuration parsing doesn't overwrite (commit: 86ece7d) (detail)
Ensure that advanced network configuration parsing doesn't overwrite (commit: bbe23f8) (detail)
Exclude StaticLoggerBinder from sql jar (#17864) (commit: 6c71ca3) (detail)
Fix shade warnings during SQL module build (#17819) (commit: 27d9e1f) (detail)
``` | 1.0 | QueueSplitBrainProtectionReadTest element_splitBrainProtection[splitBrainProtectionType:READ] - - Fails on `Hazelcast-4.master-sonar`
- Fails on [Build #698 (Nov 24, 2020 9:17:00 PM)](http://jenkins.hazelcast.com/view/Official%20Builds/job/Hazelcast-4.master-sonar/698/testReport/junit/com.hazelcast.splitbrainprotection.queue/QueueSplitBrainProtectionReadTest/element_splitBrainProtection_splitBrainProtectionType_READ_/)
- Error
```
Queue is empty!
```
- Stacktrace
```
java.util.NoSuchElementException: Queue is empty!
at com.hazelcast.collection.impl.queue.QueueProxyImpl.element(QueueProxyImpl.java:163)
at com.hazelcast.splitbrainprotection.queue.QueueSplitBrainProtectionReadTest.element_splitBrainProtection(QueueSplitBrainProtectionReadTest.java:70)
at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.base/java.lang.reflect.Method.invoke(Method.java:566)
at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:59)
at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:56)
at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
at com.hazelcast.test.FailOnTimeoutStatement$CallableStatement.call(FailOnTimeoutStatement.java:114)
at com.hazelcast.test.FailOnTimeoutStatement$CallableStatement.call(FailOnTimeoutStatement.java:106)
at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264)
at java.base/java.lang.Thread.run(Thread.java:834)
```
- Hiccups
```
Hiccups measured while running test 'element_splitBrainProtection[splitBrainProtectionType:READ](com.hazelcast.splitbrainprotection.queue.QueueSplitBrainProtectionReadTest):'
22:44:00, accumulated pauses: 314 ms, max pause: 257 ms, pauses over 1000 ms: 0
```
- Kindly check; I don't recall a recent failure of this test, and I could not identify the offending commit among the changes below
```
Ensure that RestApiConfig configuration parsing doesn't overwrite (commit: 86ece7d) (detail)
Ensure that advanced network configuration parsing doesn't overwrite (commit: bbe23f8) (detail)
Exclude StaticLoggerBinder from sql jar (#17864) (commit: 6c71ca3) (detail)
Fix shade warnings during SQL module build (#17819) (commit: 27d9e1f) (detail)
``` | non_priority | queuesplitbrainprotectionreadtest element splitbrainprotection fails on hazelcast master sonar fails on error queue is empty stacktrace java util nosuchelementexception queue is empty at com hazelcast collection impl queue queueproxyimpl element queueproxyimpl java at com hazelcast splitbrainprotection queue queuesplitbrainprotectionreadtest element splitbrainprotection queuesplitbrainprotectionreadtest java at java base jdk internal reflect nativemethodaccessorimpl native method at java base jdk internal reflect nativemethodaccessorimpl invoke nativemethodaccessorimpl java at java base jdk internal reflect delegatingmethodaccessorimpl invoke delegatingmethodaccessorimpl java at java base java lang reflect method invoke method java at org junit runners model frameworkmethod runreflectivecall frameworkmethod java at org junit internal runners model reflectivecallable run reflectivecallable java at org junit runners model frameworkmethod invokeexplosively frameworkmethod java at org junit internal runners statements invokemethod evaluate invokemethod java at com hazelcast test failontimeoutstatement callablestatement call failontimeoutstatement java at com hazelcast test failontimeoutstatement callablestatement call failontimeoutstatement java at java base java util concurrent futuretask run futuretask java at java base java lang thread run thread java hiccups hiccups measured while running test element splitbrainprotection com hazelcast splitbrainprotection queue queuesplitbrainprotectionreadtest accumulated pauses ms max pause ms pauses over ms kindly check not recall failure on this one recently i could not catch offending commit for below ensure that restapiconfig configuration parsing doesn t overwrite commit detail ensure that advanced network configuration parsing doesn t overwrite commit detail exclude staticloggerbinder from sql jar commit detail fix shade warnings during sql module build commit detail | 0 |
522,211 | 15,158,179,786 | IssuesEvent | 2021-02-12 00:31:08 | NOAA-GSL/MATS | https://api.github.com/repos/NOAA-GSL/MATS | closed | Adjust appearance of curve show/hide buttons | Priority: Medium Project: MATS Status: Closed Type: Feature | ---
Author Name: **molly.b.smith** (@mollybsmith-noaa)
Original Redmine Issue: 63802, https://vlab.ncep.noaa.gov/redmine/issues/63802
Original Date: 2019-05-10
Original Assignee: molly.b.smith
---
From the user feedback session:
*****Hide/show buttons: invert colors, background white, text color of curve. Add space between curve label and buttons.*****
| 1.0 | Adjust appearance of curve show/hide buttons - ---
Author Name: **molly.b.smith** (@mollybsmith-noaa)
Original Redmine Issue: 63802, https://vlab.ncep.noaa.gov/redmine/issues/63802
Original Date: 2019-05-10
Original Assignee: molly.b.smith
---
From the user feedback session:
*****Hide/show buttons: invert colors, background white, text color of curve. Add space between curve label and buttons.*****
| priority | adjust appearance of curve show hide buttons author name molly b smith mollybsmith noaa original redmine issue original date original assignee molly b smith from the user feedback session hide show buttons invert colors background white text color of curve add space between cure label and buttons | 1 |
9,509 | 29,145,126,091 | IssuesEvent | 2023-05-18 01:43:09 | MicaelJarniac/cookiecutter-python-project | https://api.github.com/repos/MicaelJarniac/cookiecutter-python-project | opened | Separate type-checking deps for Nox session | enhancement automation | Installing all dev deps takes too long for the type-checking session. | 1.0 | Separate type-checking deps for Nox session - Installing all dev deps takes too long for the type-checking session. | non_priority | separate type checking deps for nox session installing all dev deps takes too long for the type checking session | 0 |
219,135 | 16,819,213,279 | IssuesEvent | 2021-06-17 11:04:15 | yewstack/yew | https://api.github.com/repos/yewstack/yew | closed | Document `VRef` usage | documentation | We have an example of `VRef` usage here: https://github.com/yewstack/yew/blob/v0.12.0/examples/inner_html/src/lib.rs#L36 but we don't document this pattern anywhere. It's a common question, for example: https://gitter.im/yewframework/Lobby?at=5e5b905b8e04426dd016384a | 1.0 | Document `VRef` usage - We have an example of `VRef` usage here: https://github.com/yewstack/yew/blob/v0.12.0/examples/inner_html/src/lib.rs#L36 but we don't document this pattern anywhere. It's a common question, for example: https://gitter.im/yewframework/Lobby?at=5e5b905b8e04426dd016384a | non_priority | document vref usage we have an example of vref usage here but we don t document this pattern anywhere it s a common question for example | 0 |
586,138 | 17,570,600,695 | IssuesEvent | 2021-08-14 16:01:31 | tahmid02016/tahmid02016.github.io | https://api.github.com/repos/tahmid02016/tahmid02016.github.io | closed | The header's margin along the Y axis should be increased. | enhancement priority:medium | The margin along the Y axis should be increased further so that the top section appears larger on computer screens. | 1.0 | The header's margin along the Y axis should be increased. - The margin along the Y axis should be increased further so that the top section appears larger on computer screens. | priority | y অক্ষ বরাবর header এর মার্জিন বৃদ্ধি করতে হবে। কম্পিউটার স্ক্রিনে শীর্ষ অংশ আরও বড় দেখানোর জন্য y অক্ষ বরাবর মার্জিন আরও বৃদ্ধি করতে হবে। | 1 |
15,087 | 2,611,066,796 | IssuesEvent | 2015-02-27 00:31:13 | alistairreilly/andors-trail | https://api.github.com/repos/alistairreilly/andors-trail | closed | Ask to overwrite the savegame if the name differs. | auto-migrated Milestone-0.6.10 Priority-Medium Type-Enhancement | ```
qasur stated here:
http://andors.techby2guys.com/viewtopic.php?f=4&t=844
Can we get an "Are you sure?" button when you're about to overwrite an existing
saved file?
It should ask you if the character name being saved differs from the saved game
you're about to erase.
```
Original issue reported on code.google.com by `SamuelPl...@gmail.com` on 9 Jul 2011 at 7:22 | 1.0 | Ask to overwrite the savegame if the name differs. - ```
qasur stated here:
http://andors.techby2guys.com/viewtopic.php?f=4&t=844
Can we get an "Are you sure?" button when you're about to overwrite an existing
saved file?
It should ask you if the character name being saved differs from the saved game
you're about to erase.
```
Original issue reported on code.google.com by `SamuelPl...@gmail.com` on 9 Jul 2011 at 7:22 | priority | ask to overwrite the savegame if the name differs qasur stated here can we get an are you sure button when you re about to overwrite an existing saved file it should ask you if the character name being saved differs from the saved game you re about to erase original issue reported on code google com by samuelpl gmail com on jul at | 1 |
55,784 | 14,681,837,777 | IssuesEvent | 2020-12-31 14:30:44 | TykTechnologies/tyk-operator | https://api.github.com/repos/TykTechnologies/tyk-operator | closed | creating 2 ingress resources results in an api definition being deleted | defect ingress | Template
```
apiVersion: tyk.tyk.io/v1alpha1
kind: ApiDefinition
metadata:
name: myapideftemplate
labels:
template: "true"
spec:
name: foo
protocol: http
use_keyless: true
use_standard_auth: false
active: true
proxy:
target_url: http://example.com
strip_listen_path: true
version_data:
default_version: Default
not_versioned: true
versions:
Default:
name: Default
paths:
black_list: []
ignored: []
white_list: []
```
Ingress 1
```
apiVersion: networking.k8s.io/v1
kind: Ingress
metadata:
name: httpbin-ingress1
annotations:
kubernetes.io/ingress.class: tyk
tyk.io/template: myapideftemplate
spec:
rules:
- host: httpbin1
http:
paths:
- path: /
pathType: Prefix
backend:
service:
name: httpbin
port:
number: 8000
```
Ingress 2:
```
apiVersion: networking.k8s.io/v1
kind: Ingress
metadata:
name: httpbin-ingress2
annotations:
kubernetes.io/ingress.class: tyk
tyk.io/template: myapideftemplate
spec:
rules:
- host: httpbin2
http:
paths:
- path: /
pathType: Prefix
backend:
service:
name: httpbin
port:
number: 8000
```
When we apply them all - it appears that the 2nd ingress now overwrites the first.
```
$ k apply -f apidefinition_template.yaml
apidefinition.tyk.tyk.io/myapideftemplate created
$ k get tykapis
NAME DOMAIN LISTENPATH PROXY.TARGETURL ENABLED
myapideftemplate http://example.com true
$ k apply -f ingress.yaml
ingress.networking.k8s.io/httpbin-ingress1 created
$ k get tykapis
NAME DOMAIN LISTENPATH PROXY.TARGETURL ENABLED
default-httpbin-ingress1-e0bf94ae1 httpbin1 / http://httpbin.default.svc.cluster.local:8000 true
myapideftemplate http://example.com true
$ k apply -f ingress2.yaml
ingress.networking.k8s.io/httpbin-ingress2 created
$ k get tykapis
NAME DOMAIN LISTENPATH PROXY.TARGETURL ENABLED
default-httpbin-ingress2-3e8322b04 httpbin2 / http://httpbin.default.svc.cluster.local:8000 true
myapideftemplate http://example.com true
$ k get ingress
NAME CLASS HOSTS ADDRESS PORTS AGE
httpbin-ingress1 <none> httpbin1 80 8m31s
httpbin-ingress2 <none> httpbin2 80 8m18s
```
https://user-images.githubusercontent.com/1465130/103411114-4b7a8480-4b66-11eb-8750-5e1a1eff4fbb.mp4
| 1.0 | creating 2 ingress resources results in an api definition being deleted - Template
```
apiVersion: tyk.tyk.io/v1alpha1
kind: ApiDefinition
metadata:
name: myapideftemplate
labels:
template: "true"
spec:
name: foo
protocol: http
use_keyless: true
use_standard_auth: false
active: true
proxy:
target_url: http://example.com
strip_listen_path: true
version_data:
default_version: Default
not_versioned: true
versions:
Default:
name: Default
paths:
black_list: []
ignored: []
white_list: []
```
Ingress 1
```
apiVersion: networking.k8s.io/v1
kind: Ingress
metadata:
name: httpbin-ingress1
annotations:
kubernetes.io/ingress.class: tyk
tyk.io/template: myapideftemplate
spec:
rules:
- host: httpbin1
http:
paths:
- path: /
pathType: Prefix
backend:
service:
name: httpbin
port:
number: 8000
```
Ingress 2:
```
apiVersion: networking.k8s.io/v1
kind: Ingress
metadata:
name: httpbin-ingress2
annotations:
kubernetes.io/ingress.class: tyk
tyk.io/template: myapideftemplate
spec:
rules:
- host: httpbin2
http:
paths:
- path: /
pathType: Prefix
backend:
service:
name: httpbin
port:
number: 8000
```
When we apply them all - it appears that the 2nd ingress now overwrites the first.
```
$ k apply -f apidefinition_template.yaml
apidefinition.tyk.tyk.io/myapideftemplate created
$ k get tykapis
NAME DOMAIN LISTENPATH PROXY.TARGETURL ENABLED
myapideftemplate http://example.com true
$ k apply -f ingress.yaml
ingress.networking.k8s.io/httpbin-ingress1 created
$ k get tykapis
NAME DOMAIN LISTENPATH PROXY.TARGETURL ENABLED
default-httpbin-ingress1-e0bf94ae1 httpbin1 / http://httpbin.default.svc.cluster.local:8000 true
myapideftemplate http://example.com true
$ k apply -f ingress2.yaml
ingress.networking.k8s.io/httpbin-ingress2 created
$ k get tykapis
NAME DOMAIN LISTENPATH PROXY.TARGETURL ENABLED
default-httpbin-ingress2-3e8322b04 httpbin2 / http://httpbin.default.svc.cluster.local:8000 true
myapideftemplate http://example.com true
$ k get ingress
NAME CLASS HOSTS ADDRESS PORTS AGE
httpbin-ingress1 <none> httpbin1 80 8m31s
httpbin-ingress2 <none> httpbin2 80 8m18s
```
https://user-images.githubusercontent.com/1465130/103411114-4b7a8480-4b66-11eb-8750-5e1a1eff4fbb.mp4
| non_priority | creating ingress resources results in an api definition being deleted template apiversion tyk tyk io kind apidefinition metadata name myapideftemplate labels template true spec name foo protocol http use keyless true use standard auth false active true proxy target url strip listen path true version data default version default not versioned true versions default name default paths black list ignored white list ingress apiversion networking io kind ingress metadata name httpbin annotations kubernetes io ingress class tyk tyk io template myapideftemplate spec rules host http paths path pathtype prefix backend service name httpbin port number ingress apiversion networking io kind ingress metadata name httpbin annotations kubernetes io ingress class tyk tyk io template myapideftemplate spec rules host http paths path pathtype prefix backend service name httpbin port number when we apply them all it appears that the ingress now overwrites the first k apply f apidefinition template yaml apidefinition tyk tyk io myapideftemplate created k get tykapis name domain listenpath proxy targeturl enabled myapideftemplate true k apply f ingress yaml ingress networking io httpbin created k get tykapis name domain listenpath proxy targeturl enabled default httpbin true myapideftemplate true k apply f yaml ingress networking io httpbin created k get tykapis name domain listenpath proxy targeturl enabled default httpbin true myapideftemplate true k get ingress name class hosts address ports age httpbin httpbin | 0 |
660,007 | 21,948,305,781 | IssuesEvent | 2022-05-24 04:44:35 | DeFiCh/wallet | https://api.github.com/repos/DeFiCh/wallet | closed | Price Rates - order of token display | triage/accepted kind/feature area/ui-ux priority/low | <!-- Please only use this template for submitting enhancement/feature requests -->
#### What would you like to be added:
The order of the token displayed in price rates should be aligned with https://defiscan.live/
Follow the order of the pairing. If it's dBTC-DFI, then dBTC should be displayed first.
Align with Scan and go with the non-DUSD/DFI token as the “primary” token. Some examples to illustrate:
**dBTC-DFI**
```
Price Rates:
dBTC = ...
DFI = ...
```
**dTLSA-DUSD**
```
Price Rates:
dTSLA = ...
DUSD = ...
```
**DUSD-DFI**
```
Price Rates:
DUSD = ...
DFI = ...
```

- [ ] Main DEX Screen
- [ ] Check on Swap page if applicable
- [ ] Check on Add Liquidity page if applicable
- [ ] Check on Remove Liquidity page if applicable
#### Why is this needed:
This works better if we want to move towards a proper DEX trading experience
| 1.0 | Price Rates - order of token display - <!-- Please only use this template for submitting enhancement/feature requests -->
#### What would you like to be added:
The order of the token displayed in price rates should be aligned with https://defiscan.live/
Follow the order of the pairing. If it's dBTC-DFI, then dBTC should be displayed first.
Align with Scan and go with the non-DUSD/DFI token as the “primary” token. Some examples to illustrate:
**dBTC-DFI**
```
Price Rates:
dBTC = ...
DFI = ...
```
**dTLSA-DUSD**
```
Price Rates:
dTSLA = ...
DUSD = ...
```
**DUSD-DFI**
```
Price Rates:
DUSD = ...
DFI = ...
```

- [ ] Main DEX Screen
- [ ] Check on Swap page if applicable
- [ ] Check on Add Liquidity page if applicable
- [ ] Check on Remove Liquidity page if applicable
#### Why is this needed:
This works better if we want to move towards a proper DEX trading experience
| priority | price rates order of token display what would you like to be added the order of the token displayed in price rates should be aligned with follow the order of the pairing if it s dbtc dfi then dbtc should be displayed first align with scan and go with the non dusd dfi token as the “primary” token some examples to illustrate dbtc dfi price rates dbtc dfi dtlsa dusd price rates dtsla dusd dusd dfi price rates dusd dfi main dex screen check on swap page if applicable check on add liquidity page if applicable check on remove liquidity page if applicable why is this needed this works better if we want to move towards a proper dex trading experience | 1 |
277,031 | 24,042,810,408 | IssuesEvent | 2022-09-16 04:43:47 | Tencent/bk-ci | https://api.github.com/repos/Tencent/bk-ci | closed | bug: when the latest release in a plugin's major version line is taken down, the agent still downloads that delisted version | kind/bug for gray for test done area/ci/backend tested | **Problem:** When the latest release in a plugin's major version line is taken down, the agent still downloads that already-delisted version.
**Solution:** The agent should instead download the latest published version. | 2.0 | bug: when the latest release in a plugin's major version line is taken down, the agent still downloads that delisted version - **Problem:** When the latest release in a plugin's major version line is taken down, the agent still downloads that already-delisted version.
**Solution:** The agent should instead download the latest published version. | non_priority | bug 插件大版本中最新的版本下架的情况会导致agent下载到该已下架的版本 问题: 插件大版本中最新的版本下架的情况会导致agent下载到该已下架的版本 解决方案 : 应该让agent下载到最新已发布的版本 | 0 |
464,713 | 13,338,552,876 | IssuesEvent | 2020-08-28 11:15:36 | RxJellyBot/Jelly-Bot | https://api.github.com/repos/RxJellyBot/Jelly-Bot | closed | Message Stats - Top Active Member Message Count | Priority: 7 Type: Task | List the daily message count of the below members in the corresponding section.
- Top 3
- Top 10
- Top 20
These tops could have different member for each days.
For example:
- 1st day Top 3: A B C
- 2nd day Top 3: A B D | 1.0 | Message Stats - Top Active Member Message Count - List the daily message count of the below members in the corresponding section.
- Top 3
- Top 10
- Top 20
These tops could have different member for each days.
For example:
- 1st day Top 3: A B C
- 2nd day Top 3: A B D | priority | message stats top active member message count list the daily message count of the below members in the corresponding section top top top these tops could have different member for each days for example day top a b c day top a b d | 1 |
65,490 | 7,882,454,734 | IssuesEvent | 2018-06-26 22:49:02 | quicwg/base-drafts | https://api.github.com/repos/quicwg/base-drafts | closed | CFIN only recoverable with timeout | -tls -transport design stream0 | If we're sending CFIN and 1-RTT data in the same flight, and only the CFIN is lost, the client can only use the handshake timeout to recover the FIN and not any threshold based recovery.
We don't allow the server to use the 1-RTT keys until it receives the CFIN. This makes it so that even if the server has received 1-RTT data it cannot ack it which would have triggered threshold loss recovery. So the client has to rely on the handshake timeout to send the data.
Are we fine with saying that the client should have a way more aggressive loss timeout for handshake packets than the server should? | 1.0 | CFIN only recoverable with timeout - If we're sending CFIN and 1-RTT data in the same flight, and only the CFIN is lost, the client can only use the handshake timeout to recover the FIN and not any threshold based recovery.
We don't allow the server to use the 1-RTT keys until it receives the CFIN. This makes it so that even if the server has received 1-RTT data it cannot ack it which would have triggered threshold loss recovery. So the client has to rely on the handshake timeout to send the data.
Are we fine with saying that the client should have a way more aggressive loss timeout for handshake packets than the server should? | non_priority | cfin only recoverable with timeout if we re sending cfin and rtt data in the same flight and only the cfin is lost the client can only use the handshake timeout to recover the fin and not any threshold based recovery we don t allow the server to use the rtt keys until it receives the cfin this makes it so that even if the server has received rtt data it cannot ack it which would have triggered threshold loss recovery so the client has to rely on the handshake timeout to send the data are we fine with saying that the client should have a way more aggressive loss timeout for handshake packets than the server should | 0 |
264,157 | 19,989,485,517 | IssuesEvent | 2022-01-31 03:24:18 | eslutz/Space-Adventure-Text-Game | https://api.github.com/repos/eslutz/Space-Adventure-Text-Game | reopened | Update/Add Documentation | documentation | Create or update the following files:
- [ ] README
- [ ] CHANGELOG
- [ ] SECURITY
- [ ] CODEOWNER
- [x] Issue Templates | 1.0 | Update/Add Documentation - Create or update the following files:
- [ ] README
- [ ] CHANGELOG
- [ ] SECURITY
- [ ] CODEOWNER
- [x] Issue Templates | non_priority | update add documentation create or update the following files readme changelog security codeowner issue templates | 0 |
559,003 | 16,547,269,920 | IssuesEvent | 2021-05-28 02:36:47 | swlegion/tts | https://api.github.com/repos/swlegion/tts | closed | Issues to fix for re-landing Spawnv2 | priority 1: high 🐛 bug | Make sure to test with this [`cis-spam.json`](https://gist.githubusercontent.com/matanlurey/8fcf4fc55250cc1b68eabdb093a0517c/raw/fe6d79f78dd469cafd8420f8fad32618bbfaeb6b/cis-spam.json).
- Add collider data for all of the assetbundle-based minis
- Larger units (>5 minis?) spawn models stacked
- Units not spawning at all in the back row
- When they do spawn, models are being scattered across the x-axis | 1.0 | Issues to fix for re-landing Spawnv2 - Make sure to test with this [`cis-spam.json`](https://gist.githubusercontent.com/matanlurey/8fcf4fc55250cc1b68eabdb093a0517c/raw/fe6d79f78dd469cafd8420f8fad32618bbfaeb6b/cis-spam.json).
- Add collider data for all of the assetbundle-based minis
- Larger units (>5 minis?) spawn models stacked
- Units not spawning at all in the back row
- When they do spawn, models are being scattered across the x-axis | priority | issues to fix for re landing make sure to test with this add collider data for all of the assetbundle based minis larger units minis spawn models stacked units not spawning at all in the back row when they do spawn models are being scattered across the x axis | 1 |
224,646 | 17,767,048,199 | IssuesEvent | 2021-08-30 08:53:28 | ckeditor/ckeditor4 | https://api.github.com/repos/ckeditor/ckeditor4 | closed | Link and image are not displayed properly by print plugin | type:bug status:confirmed type:failingtest status:blocked | ## Type of report
Bug
## Provide detailed reproduction steps (if any)
1. Go to `/tests/plugins/print/manual/print`
2. Click `Print` button
3. Check out preview
### Expected result
Preview content matches editor content exactly.
### Actual result
9/10 tries image is missing and link is displayed as plain text.
## Other details
In Edge Chromium it works fine if you click `Preview` button first and don't close the preview window. This workaround doesn't work for Chrome though. Also it seemed to work in `4.13.1` (check it out [here](http://cdn.ckeditor.com/4.13.1/full-all/samples/)).
* Browser: Chrome, Edge Chromium
* OS: macOS
* CKEditor version: `4.14.0`
* Installed CKEditor plugins: `print`, `preview`
| 1.0 | Link and image are not displayed properly by print plugin - ## Type of report
Bug
## Provide detailed reproduction steps (if any)
1. Go to `/tests/plugins/print/manual/print`
2. Click `Print` button
3. Check out preview
### Expected result
Preview content matches editor content exactly.
### Actual result
9/10 tries image is missing and link is displayed as plain text.
## Other details
In Edge Chromium it works fine if you click `Preview` button first and don't close the preview window. This workaround doesn't work for Chrome though. Also it seemed to work in `4.13.1` (check it out [here](http://cdn.ckeditor.com/4.13.1/full-all/samples/)).
* Browser: Chrome, Edge Chromium
* OS: macOS
* CKEditor version: `4.14.0`
* Installed CKEditor plugins: `print`, `preview`
| non_priority | link and image are not displayed properly by print plugin type of report bug provide detailed reproduction steps if any go to tests plugins print manual print click print button check out preview expected result preview content matches editor content exactly actual result tries image is missing and link is displayed as plain text other details in edge chromium it works fine if you click preview button first and don t close the preview window this workaround doesn t work for chrome though also it seemed to work in check it out browser chrome edge chromium os macos ckeditor version installed ckeditor plugins print preview | 0 |
269,531 | 23,447,627,793 | IssuesEvent | 2022-08-15 21:26:07 | common-fate/granted-approvals | https://api.github.com/repos/common-fate/granted-approvals | opened | Add tests for snowflake | tests | Currently snowflake is implemented and functional, however we do not have adequate tests in place.
https://github.com/common-fate/granted-approvals/commit/a2c38f95e9ca835f5be342de3b50141124962191 | 1.0 | Add tests for snowflake - Currently snowflake is implemented and functional, however we do not have adequate tests in place.
https://github.com/common-fate/granted-approvals/commit/a2c38f95e9ca835f5be342de3b50141124962191 | non_priority | add tests for snowflake currently snowflake is implemented and functional however we do not have adequate tests in place | 0 |
118,679 | 17,598,790,955 | IssuesEvent | 2021-08-17 09:11:30 | ghc-dev/Gary-Parks | https://api.github.com/repos/ghc-dev/Gary-Parks | opened | CVE-2021-21290 (Medium) detected in netty-handler-4.1.39.Final.jar, netty-codec-http-4.1.39.Final.jar | security vulnerability | ## CVE-2021-21290 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Libraries - <b>netty-handler-4.1.39.Final.jar</b>, <b>netty-codec-http-4.1.39.Final.jar</b></p></summary>
<p>
<details><summary><b>netty-handler-4.1.39.Final.jar</b></p></summary>
<p>Netty is an asynchronous event-driven network application framework for
rapid development of maintainable high performance protocol servers and
clients.</p>
<p>Library home page: <a href="https://netty.io/">https://netty.io/</a></p>
<p>Path to dependency file: Gary-Parks/build.gradle</p>
<p>Path to vulnerable library: /home/wss-scanner/.gradle/caches/modules-2/files-2.1/io.netty/netty-handler/4.1.39.Final/4a63b56de071c1b10a56b5d90095e4201ea4098f/netty-handler-4.1.39.Final.jar</p>
<p>
Dependency Hierarchy:
- netty-codec-http-4.1.39.Final.jar (Root Library)
- :x: **netty-handler-4.1.39.Final.jar** (Vulnerable Library)
</details>
<details><summary><b>netty-codec-http-4.1.39.Final.jar</b></p></summary>
<p>Netty is an asynchronous event-driven network application framework for
rapid development of maintainable high performance protocol servers and
clients.</p>
<p>Library home page: <a href="https://netty.io/">https://netty.io/</a></p>
<p>Path to dependency file: Gary-Parks/build.gradle</p>
<p>Path to vulnerable library: dle/caches/modules-2/files-2.1/io.netty/netty-codec-http/4.1.39.Final/732d06961162e27fa3ae5989541c4460853745d3/netty-codec-http-4.1.39.Final.jar</p>
<p>
Dependency Hierarchy:
- :x: **netty-codec-http-4.1.39.Final.jar** (Vulnerable Library)
</details>
<p>Found in HEAD commit: <a href="https://github.com/ghc-dev/Gary-Parks/commit/a6eada4e562b47ba8906f120128407f27194a69d">a6eada4e562b47ba8906f120128407f27194a69d</a></p>
<p>Found in base branch: <b>master</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
Netty is an open-source, asynchronous event-driven network application framework for rapid development of maintainable high performance protocol servers & clients. In Netty before version 4.1.59.Final there is a vulnerability on Unix-like systems involving an insecure temp file. When netty's multipart decoders are used, local information disclosure can occur via the local system temporary directory if temporarily storing uploads on the disk is enabled. On unix-like systems, the temporary directory is shared between all users. As such, writing to this directory using APIs that do not explicitly set the file/directory permissions can lead to information disclosure. Of note, this does not impact modern MacOS Operating Systems. The method "File.createTempFile" on unix-like systems creates a random file, but by default will create this file with the permissions "-rw-r--r--". Thus, if sensitive information is written to this file, other local users can read this information. This is the case in netty's "AbstractDiskHttpData", which is vulnerable. This has been fixed in version 4.1.59.Final. As a workaround, one may specify your own "java.io.tmpdir" when you start the JVM or use "DefaultHttpDataFactory.setBaseDir(...)" to set the directory to something that is only readable by the current user.
<p>Publish Date: 2021-02-08
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-21290>CVE-2021-21290</a></p>
</p>
</details>
<p></p>
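The permission difference described above can be demonstrated with the JDK alone; no netty code is required. Below is a minimal sketch (the class name `TempFileDemo` is ours, not from the advisory) contrasting the legacy `File.createTempFile`, which honours the process umask and on typical unix systems yields world-readable `-rw-r--r--` files, with `java.nio.file.Files.createTempFile`, which on POSIX file systems defaults to owner-only permissions, the behaviour netty adopted in 4.1.59.Final.

```java
import java.io.File;
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.attribute.PosixFilePermissions;

public class TempFileDemo {

    // Legacy API: java.io.File.createTempFile applies the process umask,
    // so on a typical unix system the file ends up world-readable.
    static String legacyPerms() throws IOException {
        File f = File.createTempFile("netty-demo", ".tmp");
        f.deleteOnExit();
        return PosixFilePermissions.toString(
                Files.getPosixFilePermissions(f.toPath()));
    }

    // NIO API: java.nio.file.Files.createTempFile creates the file with
    // owner-only permissions ("rw-------") on POSIX file systems.
    static String nioPerms() throws IOException {
        Path p = Files.createTempFile("netty-demo", ".tmp");
        String perms = PosixFilePermissions.toString(
                Files.getPosixFilePermissions(p));
        Files.delete(p);
        return perms;
    }

    public static void main(String[] args) throws IOException {
        System.out.println("File.createTempFile  -> " + legacyPerms());
        System.out.println("Files.createTempFile -> " + nioPerms());
    }
}
```

If upgrading to `io.netty:netty-codec-http:4.1.59.Final` is not immediately possible, the advisory's workarounds (a private `java.io.tmpdir`, or `DefaultHttpDataFactory.setBaseDir(...)` pointing at a directory readable only by the current user) achieve the same isolation at the directory level.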
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>5.5</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Local
- Attack Complexity: Low
- Privileges Required: Low
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: None
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://github.com/netty/netty/security/advisories/GHSA-5mcr-gq6c-3hq2">https://github.com/netty/netty/security/advisories/GHSA-5mcr-gq6c-3hq2</a></p>
<p>Release Date: 2021-02-08</p>
<p>Fix Resolution: io.netty:netty-codec-http:4.1.59.Final</p>
</p>
</details>
<p></p>
<!-- <REMEDIATE>{"isOpenPROnVulnerability":true,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"Java","groupId":"io.netty","packageName":"netty-handler","packageVersion":"4.1.39.Final","packageFilePaths":["/build.gradle"],"isTransitiveDependency":true,"dependencyTree":"io.netty:netty-codec-http:4.1.39.Final;io.netty:netty-handler:4.1.39.Final","isMinimumFixVersionAvailable":true,"minimumFixVersion":"io.netty:netty-codec-http:4.1.59.Final"},{"packageType":"Java","groupId":"io.netty","packageName":"netty-codec-http","packageVersion":"4.1.39.Final","packageFilePaths":["/build.gradle"],"isTransitiveDependency":false,"dependencyTree":"io.netty:netty-codec-http:4.1.39.Final","isMinimumFixVersionAvailable":true,"minimumFixVersion":"io.netty:netty-codec-http:4.1.59.Final"}],"baseBranches":["master"],"vulnerabilityIdentifier":"CVE-2021-21290","vulnerabilityDetails":"Netty is an open-source, asynchronous event-driven network application framework for rapid development of maintainable high performance protocol servers \u0026 clients. In Netty before version 4.1.59.Final there is a vulnerability on Unix-like systems involving an insecure temp file. When netty\u0027s multipart decoders are used local information disclosure can occur via the local system temporary directory if temporary storing uploads on the disk is enabled. On unix-like systems, the temporary directory is shared between all user. As such, writing to this directory using APIs that do not explicitly set the file/directory permissions can lead to information disclosure. Of note, this does not impact modern MacOS Operating Systems. The method \"File.createTempFile\" on unix-like systems creates a random file, but, by default will create this file with the permissions \"-rw-r--r--\". Thus, if sensitive information is written to this file, other local users can read this information. This is the case in netty\u0027s \"AbstractDiskHttpData\" is vulnerable. 
This has been fixed in version 4.1.59.Final. As a workaround, one may specify your own \"java.io.tmpdir\" when you start the JVM or use \"DefaultHttpDataFactory.setBaseDir(...)\" to set the directory to something that is only readable by the current user.","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-21290","cvss3Severity":"medium","cvss3Score":"5.5","cvss3Metrics":{"A":"None","AC":"Low","PR":"Low","S":"Unchanged","C":"High","UI":"None","AV":"Local","I":"None"},"extraData":{}}</REMEDIATE> -->
300,790 | 9,212,466,921 | IssuesEvent | 2019-03-10 00:58:14 | gravityview/GravityView | https://api.github.com/repos/gravityview/GravityView | closed | Editing an entry strips the labels from product calculation fields, nulls totals | Bug Core: Edit Entry Core: Fields Difficulty: Medium Priority: High | The labels get stripped from the receipt table after editing in Edit Entry; probably due to bad serialization?
Also look into whether it's caused by the deletion of the entry meta in the `GravityView_Field_Product::clear_product_info_cache()` method.

See [HS#10931](https://secure.helpscout.net/conversation/430351819/10931/).
┆Issue is synchronized with this [Asana task](https://app.asana.com/0/995529792029955/995656589921776)
84,710 | 24,393,514,444 | IssuesEvent | 2022-10-04 17:07:22 | expo/expo | https://api.github.com/repos/expo/expo | closed | Create development build using EAS with Notifee - Could not determine the dependencies of task ':app:compileDebugJavaWithJavac'. | needs review Development Builds | ### Summary
Create development build using EAS.
I've followed the [ Notifee Expo support workflow ](https://notifee.app/react-native/docs/installation#expo-support) running
```
expo prebuild
eas build --profile development --platform android
```
with the latest `expo-dev-client` package, `expo-cli`, `eas-cli`, and Java 11,
the build fails while running gradlew, as follows:
```
Running 'gradlew :app:assembleDebug' in /home/expo/workingdir/build/android
Downloading https://services.gradle.org/distributions/gradle-7.3.3-all.zip
10% 20% 30% 40% 50% 60% 70% 80% 90% 100%
Welcome to Gradle 7.3.3!
Here are the highlights of this release:
- Easily declare new test suites in Java projects
- Support for Java 17
- Support for Scala 3
For more details see https://docs.gradle.org/7.3.3/release-notes.html
To honour the JVM settings for this build a single-use Daemon process will be forked. See https://docs.gradle.org/7.3.3/userguide/gradle_daemon.html#sec:disabling_the_daemon.
Daemon will be stopped at the end of the build
> Task :react-native-gradle-plugin:compileKotlin
'compileJava' task (current target is 1.8) and 'compileKotlin' task (current target is 11) jvm target compatibility should be set to the same Java version.
> Task :react-native-gradle-plugin:pluginDescriptors
> Task :react-native-gradle-plugin:processResources
> Task :react-native-gradle-plugin:compileKotlin
w: /home/expo/workingdir/build/node_modules/react-native-gradle-plugin/src/main/kotlin/com/facebook/react/ReactExtension.kt: (10, 37): 'BaseVariant' is deprecated. Deprecated in Java
w: /home/expo/workingdir/build/node_modules/react-native-gradle-plugin/src/main/kotlin/com/facebook/react/ReactExtension.kt: (119, 30): 'BaseVariant' is deprecated. Deprecated in Java
w: /home/expo/workingdir/build/node_modules/react-native-gradle-plugin/src/main/kotlin/com/facebook/react/ReactExtension.kt: (135, 26): 'BaseVariant' is deprecated. Deprecated in Java
w: /home/expo/workingdir/build/node_modules/react-native-gradle-plugin/src/main/kotlin/com/facebook/react/ReactExtension.kt: (157, 32): 'BaseVariant' is deprecated. Deprecated in Java
w: /home/expo/workingdir/build/node_modules/react-native-gradle-plugin/src/main/kotlin/com/facebook/react/ReactExtension.kt: (163, 31): 'BaseVariant' is deprecated. Deprecated in Java
w: /home/expo/workingdir/build/node_modules/react-native-gradle-plugin/src/main/kotlin/com/facebook/react/ReactExtension.kt: (171, 36): 'BaseVariant' is deprecated. Deprecated in Java
w: /home/expo/workingdir/build/node_modules/react-native-gradle-plugin/src/main/kotlin/com/facebook/react/ReactPlugin.kt: (100, 48): 'reactRoot: DirectoryProperty' is deprecated. reactRoot was confusing and has been replace with root to point to your root project and reactNativeDir to point to the folder of the react-native NPM package
w: /home/expo/workingdir/build/node_modules/react-native-gradle-plugin/src/main/kotlin/com/facebook/react/TaskConfiguration.kt: (10, 37): 'ApplicationVariant' is deprecated. Deprecated in Java
w: /home/expo/workingdir/build/node_modules/react-native-gradle-plugin/src/main/kotlin/com/facebook/react/TaskConfiguration.kt: (11, 37): 'BaseVariant' is deprecated. Deprecated in Java
w: /home/expo/workingdir/build/node_modules/react-native-gradle-plugin/src/main/kotlin/com/facebook/react/TaskConfiguration.kt: (12, 37): 'LibraryVariant' is deprecated. Deprecated in Java
w: /home/expo/workingdir/build/node_modules/react-native-gradle-plugin/src/main/kotlin/com/facebook/react/TaskConfiguration.kt: (27, 51): 'BaseVariant' is deprecated. Deprecated in Java
w: /home/expo/workingdir/build/node_modules/react-native-gradle-plugin/src/main/kotlin/com/facebook/react/TaskConfiguration.kt: (130, 12): 'ApplicationVariant' is deprecated. Deprecated in Java
w: /home/expo/workingdir/build/node_modules/react-native-gradle-plugin/src/main/kotlin/com/facebook/react/TaskConfiguration.kt: (131, 12): 'LibraryVariant' is deprecated. Deprecated in Java
w: /home/expo/workingdir/build/node_modules/react-native-gradle-plugin/src/main/kotlin/com/facebook/react/TaskConfiguration.kt: (251, 14): 'BaseVariant' is deprecated. Deprecated in Java
> Task :react-native-gradle-plugin:compileJava
[stderr] Note: /home/expo/workingdir/build/node_modules/react-native-gradle-plugin/src/main/java/com/facebook/react/codegen/generator/SchemaJsonParser.java uses or overrides a deprecated API.
[stderr] Note: Recompile with -Xlint:deprecation for details.
> Task :react-native-gradle-plugin:classes
> Task :react-native-gradle-plugin:inspectClassesForKotlinIC
> Task :react-native-gradle-plugin:jar
> Configure project :expo-application
WARNING:Software Components will not be created automatically for Maven publishing from Android Gradle Plugin 8.0. To opt-in to the future behavior, set the Gradle property android.disableAutomaticComponentCreation=true in the `gradle.properties` file or use the new publishing DSL.
> Configure project :expo-background-fetch
WARNING:Software Components will not be created automatically for Maven publishing from Android Gradle Plugin 8.0. To opt-in to the future behavior, set the Gradle property android.disableAutomaticComponentCreation=true in the `gradle.properties` file or use the new publishing DSL.
> Configure project :expo-constants
WARNING:Software Components will not be created automatically for Maven publishing from Android Gradle Plugin 8.0. To opt-in to the future behavior, set the Gradle property android.disableAutomaticComponentCreation=true in the `gradle.properties` file or use the new publishing DSL.
> Configure project :expo-dev-client
WARNING:Software Components will not be created automatically for Maven publishing from Android Gradle Plugin 8.0. To opt-in to the future behavior, set the Gradle property android.disableAutomaticComponentCreation=true in the `gradle.properties` file or use the new publishing DSL.
> Configure project :expo-dev-launcher
WARNING:Software Components will not be created automatically for Maven publishing from Android Gradle Plugin 8.0. To opt-in to the future behavior, set the Gradle property android.disableAutomaticComponentCreation=true in the `gradle.properties` file or use the new publishing DSL.
> Configure project :expo-dev-menu
WARNING:Software Components will not be created automatically for Maven publishing from Android Gradle Plugin 8.0. To opt-in to the future behavior, set the Gradle property android.disableAutomaticComponentCreation=true in the `gradle.properties` file or use the new publishing DSL.
> Configure project :expo-dev-menu-interface
WARNING:Software Components will not be created automatically for Maven publishing from Android Gradle Plugin 8.0. To opt-in to the future behavior, set the Gradle property android.disableAutomaticComponentCreation=true in the `gradle.properties` file or use the new publishing DSL.
> Configure project :expo-error-recovery
WARNING:Software Components will not be created automatically for Maven publishing from Android Gradle Plugin 8.0. To opt-in to the future behavior, set the Gradle property android.disableAutomaticComponentCreation=true in the `gradle.properties` file or use the new publishing DSL.
> Configure project :expo-file-system
WARNING:Software Components will not be created automatically for Maven publishing from Android Gradle Plugin 8.0. To opt-in to the future behavior, set the Gradle property android.disableAutomaticComponentCreation=true in the `gradle.properties` file or use the new publishing DSL.
> Configure project :expo-font
WARNING:Software Components will not be created automatically for Maven publishing from Android Gradle Plugin 8.0. To opt-in to the future behavior, set the Gradle property android.disableAutomaticComponentCreation=true in the `gradle.properties` file or use the new publishing DSL.
> Configure project :expo-json-utils
WARNING:Software Components will not be created automatically for Maven publishing from Android Gradle Plugin 8.0. To opt-in to the future behavior, set the Gradle property android.disableAutomaticComponentCreation=true in the `gradle.properties` file or use the new publishing DSL.
> Configure project :expo-keep-awake
WARNING:Software Components will not be created automatically for Maven publishing from Android Gradle Plugin 8.0. To opt-in to the future behavior, set the Gradle property android.disableAutomaticComponentCreation=true in the `gradle.properties` file or use the new publishing DSL.
> Configure project :expo-manifests
WARNING:Software Components will not be created automatically for Maven publishing from Android Gradle Plugin 8.0. To opt-in to the future behavior, set the Gradle property android.disableAutomaticComponentCreation=true in the `gradle.properties` file or use the new publishing DSL.
> Configure project :expo-modules-core
WARNING:Software Components will not be created automatically for Maven publishing from Android Gradle Plugin 8.0. To opt-in to the future behavior, set the Gradle property android.disableAutomaticComponentCreation=true in the `gradle.properties` file or use the new publishing DSL.
Checking the license for package NDK (Side by side) 21.4.7075529 in /home/expo/Android/Sdk/licenses
License for package NDK (Side by side) 21.4.7075529 accepted.
Preparing "Install NDK (Side by side) 21.4.7075529 (revision: 21.4.7075529)".
"Install NDK (Side by side) 21.4.7075529 (revision: 21.4.7075529)" ready.
Installing NDK (Side by side) 21.4.7075529 in /home/expo/Android/Sdk/ndk/21.4.7075529
"Install NDK (Side by side) 21.4.7075529 (revision: 21.4.7075529)" complete.
"Install NDK (Side by side) 21.4.7075529 (revision: 21.4.7075529)" finished.
Checking the license for package CMake 3.18.1 in /home/expo/Android/Sdk/licenses
License for package CMake 3.18.1 accepted.
Preparing "Install CMake 3.18.1 (revision: 3.18.1)".
"Install CMake 3.18.1 (revision: 3.18.1)" ready.
Installing CMake 3.18.1 in /home/expo/Android/Sdk/cmake/3.18.1
"Install CMake 3.18.1 (revision: 3.18.1)" complete.
"Install CMake 3.18.1 (revision: 3.18.1)" finished.
> Configure project :expo-splash-screen
WARNING:Software Components will not be created automatically for Maven publishing from Android Gradle Plugin 8.0. To opt-in to the future behavior, set the Gradle property android.disableAutomaticComponentCreation=true in the `gradle.properties` file or use the new publishing DSL.
> Configure project :expo-task-manager
WARNING:Software Components will not be created automatically for Maven publishing from Android Gradle Plugin 8.0. To opt-in to the future behavior, set the Gradle property android.disableAutomaticComponentCreation=true in the `gradle.properties` file or use the new publishing DSL.
> Configure project :expo-updates-interface
Warning: The 'kotlin-android-extensions' Gradle plugin is deprecated. Please use this migration guide (https://goo.gle/kotlin-android-extensions-deprecation) to start working with View Binding (https://developer.android.com/topic/libraries/view-binding) and the 'kotlin-parcelize' plugin.
WARNING:Software Components will not be created automatically for Maven publishing from Android Gradle Plugin 8.0. To opt-in to the future behavior, set the Gradle property android.disableAutomaticComponentCreation=true in the `gradle.properties` file or use the new publishing DSL.
> Configure project :unimodules-app-loader
WARNING:Software Components will not be created automatically for Maven publishing from Android Gradle Plugin 8.0. To opt-in to the future behavior, set the Gradle property android.disableAutomaticComponentCreation=true in the `gradle.properties` file or use the new publishing DSL.
> Configure project :expo
Using expo modules
- expo-application (4.2.2)
- expo-background-fetch (10.3.0)
- expo-constants (13.2.4)
- expo-dev-client (1.3.0)
- expo-dev-launcher (1.3.0)
- expo-dev-menu (1.3.0)
- expo-error-recovery (3.2.0)
- expo-file-system (14.1.0)
- expo-font (10.2.1)
- expo-json-utils (0.3.0)
- expo-keep-awake (10.2.0)
- expo-manifests (0.3.1)
- expo-modules-core (0.11.5)
- expo-splash-screen (0.16.2)
- expo-task-manager (10.3.0)
- unimodules-app-loader (3.1.0)
WARNING:Software Components will not be created automatically for Maven publishing from Android Gradle Plugin 8.0. To opt-in to the future behavior, set the Gradle property android.disableAutomaticComponentCreation=true in the `gradle.properties` file or use the new publishing DSL.
> Configure project :notifee_react-native
:notifee_react-native @notifee/react-native found at /home/expo/workingdir/build/node_modules/@notifee/react-native
:notifee_react-native package.json found at /home/expo/workingdir/build/node_modules/@notifee/react-native/package.json
:notifee_react-native:version set from package.json: 7.0.1 (7,0,1 - 7000001)
:notifee_react-native:android.compileSdk using custom value: 46
:notifee_react-native:android.targetSdk using custom value: 31
:notifee_react-native:android.minSdk using custom value: 21
:notifee_react-native:reactNativeAndroidDir /home/expo/workingdir/build/node_modules/react-native/android
WARNING:We recommend using a newer Android Gradle plugin to use compileSdk = 46
This Android Gradle plugin (7.1.1) was tested up to compileSdk = 32
This warning can be suppressed by adding
android.suppressUnsupportedCompileSdk=46
to this project's gradle.properties
The build will continue, but you are strongly encouraged to update your project to
use a newer Android Gradle Plugin that has been tested with compileSdk = 46
Checking the license for package Android SDK Build-Tools 30.0.3 in /home/expo/Android/Sdk/licenses
License for package Android SDK Build-Tools 30.0.3 accepted.
Preparing "Install Android SDK Build-Tools 30.0.3 (revision: 30.0.3)".
"Install Android SDK Build-Tools 30.0.3 (revision: 30.0.3)" ready.
Installing Android SDK Build-Tools 30.0.3 in /home/expo/Android/Sdk/build-tools/30.0.3
"Install Android SDK Build-Tools 30.0.3 (revision: 30.0.3)" complete.
"Install Android SDK Build-Tools 30.0.3 (revision: 30.0.3)" finished.
Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.
You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.
[stderr] FAILURE: Build failed with an exception.
[stderr] * What went wrong:
[stderr] Could not determine the dependencies of task ':app:compileDebugJavaWithJavac'.
[stderr] > Failed to find Platform SDK with path: platforms;android-46
[stderr] * Try:
[stderr] > Run with --stacktrace option to get the stack trace.
[stderr] > Run with --info or --debug option to get more log output.
[stderr] > Run with --scan to get full insights.
[stderr] * Get more help at https://help.gradle.org
See https://docs.gradle.org/7.3.3/userguide/command_line_interface.html#sec:command_line_warnings
6 actionable tasks: 6 executed
[stderr] BUILD FAILED in 3m 44s
Error: Gradle build failed with unknown error. See logs for the "Run gradlew" phase for more information.
```
eas.json:
```
{
"cli": {
"version": ">= 2.1.0"
},
"build": {
"development": {
"developmentClient": true,
"distribution": "internal",
"android": {
"image": "ubuntu-18.04-jdk-11-ndk-r19c"
}
}
}
}
```
package.json dependencies:
```
"dependencies": {
"@notifee/react-native": "^7.0.1",
"@react-native-async-storage/async-storage": "~1.17.3",
"expo": "~46.0.13",
"expo-background-fetch": "~10.3.0",
"expo-build-properties": "~0.3.0",
"expo-dev-client": "~1.3.0",
"expo-splash-screen": "~0.16.2",
"expo-status-bar": "~1.4.0",
"react": "18.0.0",
"react-dom": "18.0.0",
"react-native": "0.69.6",
"react-native-web": "~0.18.7"
},
```
app.json plugins:
```
"plugins": [
"@notifee/react-native",
["expo-build-properties",
{
"android": {
"compileSdkVersion": 46
}
}]
],
```
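The failure in the log above, `Failed to find Platform SDK with path: platforms;android-46`, suggests that the value 46 from `compileSdkVersion` is being consumed as an Android API level, which doesn't exist — 46 is the Expo SDK version, not an Android SDK platform. A minimal sketch of the same plugin entry using an Android API level instead (31 here is an assumption, matching the `targetSdk` value the log reports) would be:

```
"plugins": [
  "@notifee/react-native",
  ["expo-build-properties",
  {
    "android": {
      "compileSdkVersion": 31,
      "targetSdkVersion": 31
    }
  }]
],
```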
### Managed or bare workflow?
managed
### What platform(s) does this occur on?
Android
### Package versions
```
"dependencies": {
"@notifee/react-native": "^7.0.1",
"@react-native-async-storage/async-storage": "~1.17.3",
"expo": "~46.0.13",
"expo-background-fetch": "~10.3.0",
"expo-build-properties": "~0.3.0",
"expo-dev-client": "~1.3.0",
"expo-splash-screen": "~0.16.2",
"expo-status-bar": "~1.4.0",
"expo-updates": "^0.14.6",
"react": "18.0.0",
"react-dom": "18.0.0",
"react-native": "0.69.6",
"react-native-web": "~0.18.7"
},
```
### Environment
```
expo-env-info 1.0.5 environment info:
System:
OS: Windows 10 10.0.19044
Binaries:
Node: 14.17.0 - C:\Program Files\nodejs\node.EXE
npm: 8.1.2 - C:\Program Files\nodejs\npm.CMD
IDEs:
Android Studio: Version 4.2.0.0 AI-202.7660.26.42.7486908
npmPackages:
expo: ~46.0.13 => 46.0.13
react: 18.0.0 => 18.0.0
react-dom: 18.0.0 => 18.0.0
react-native: 0.69.6 => 0.69.6
react-native-web: ~0.18.7 => 0.18.9
Expo Workflow: bare
```
This reports a bare workflow because I've run `expo prebuild`, as per Notifee's installation guide; however, I haven't changed anything in the android folder.
### Reproducible demo
https://github.com/TheFunEmbargo/QuotesOfNote
### Stacktrace (if a crash is involved)
Same as the build log in the Summary above.
use a newer Android Gradle Plugin that has been tested with compileSdk = 46
Checking the license for package Android SDK Build-Tools 30.0.3 in /home/expo/Android/Sdk/licenses
License for package Android SDK Build-Tools 30.0.3 accepted.
Preparing "Install Android SDK Build-Tools 30.0.3 (revision: 30.0.3)".
"Install Android SDK Build-Tools 30.0.3 (revision: 30.0.3)" ready.
Installing Android SDK Build-Tools 30.0.3 in /home/expo/Android/Sdk/build-tools/30.0.3
"Install Android SDK Build-Tools 30.0.3 (revision: 30.0.3)" complete.
"Install Android SDK Build-Tools 30.0.3 (revision: 30.0.3)" finished.
Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.
You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.
[stderr] FAILURE: Build failed with an exception.
[stderr] * What went wrong:
[stderr] Could not determine the dependencies of task ':app:compileDebugJavaWithJavac'.
[stderr] > Failed to find Platform SDK with path: platforms;android-46
[stderr] * Try:
[stderr] > Run with --stacktrace option to get the stack trace.
[stderr] > Run with --info or --debug option to get more log output.
[stderr] > Run with --scan to get full insights.
[stderr] * Get more help at https://help.gradle.org
See https://docs.gradle.org/7.3.3/userguide/command_line_interface.html#sec:command_line_warnings
6 actionable tasks: 6 executed
[stderr] BUILD FAILED in 3m 44s
Error: Gradle build failed with unknown error. See logs for the "Run gradlew" phase for more information.
```
eas.json:
```
{
"cli": {
"version": ">= 2.1.0"
},
"build": {
"development": {
"developmentClient": true,
"distribution": "internal",
"android": {
"image": "ubuntu-18.04-jdk-11-ndk-r19c"
}
}
}
}
```
package.json dependencies:
```
"dependencies": {
"@notifee/react-native": "^7.0.1",
"@react-native-async-storage/async-storage": "~1.17.3",
"expo": "~46.0.13",
"expo-background-fetch": "~10.3.0",
"expo-build-properties": "~0.3.0",
"expo-dev-client": "~1.3.0",
"expo-splash-screen": "~0.16.2",
"expo-status-bar": "~1.4.0",
"react": "18.0.0",
"react-dom": "18.0.0",
"react-native": "0.69.6",
"react-native-web": "~0.18.7"
},
```
app.json plugins:
```
"plugins": [
"@notifee/react-native",
["expo-build-properties",
{
"android": {
"compileSdkVersion": 46
}
}]
],
```
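For comparison, here is a sketch of the same plugin entry with `compileSdkVersion` set to an Android API level that the tooling in the log actually knows about (31 is an assumption taken from the `android.targetSdk using custom value: 31` line; the `Failed to find Platform SDK with path: platforms;android-46` error suggests 46 is being treated as an Android platform level, which does not exist — 46 is the Expo SDK version):

```json
{
  "plugins": [
    "@notifee/react-native",
    [
      "expo-build-properties",
      {
        "android": {
          "compileSdkVersion": 31
        }
      }
    ]
  ]
}
```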
### Managed or bare workflow?
managed
### What platform(s) does this occur on?
Android
### Package versions
```
"dependencies": {
"@notifee/react-native": "^7.0.1",
"@react-native-async-storage/async-storage": "~1.17.3",
"expo": "~46.0.13",
"expo-background-fetch": "~10.3.0",
"expo-build-properties": "~0.3.0",
"expo-dev-client": "~1.3.0",
"expo-splash-screen": "~0.16.2",
"expo-status-bar": "~1.4.0",
"expo-updates": "^0.14.6",
"react": "18.0.0",
"react-dom": "18.0.0",
"react-native": "0.69.6",
"react-native-web": "~0.18.7"
},
```
### Environment
```
expo-env-info 1.0.5 environment info:
System:
OS: Windows 10 10.0.19044
Binaries:
Node: 14.17.0 - C:\Program Files\nodejs\node.EXE
npm: 8.1.2 - C:\Program Files\nodejs\npm.CMD
IDEs:
Android Studio: Version 4.2.0.0 AI-202.7660.26.42.7486908
npmPackages:
expo: ~46.0.13 => 46.0.13
react: 18.0.0 => 18.0.0
react-dom: 18.0.0 => 18.0.0
react-native: 0.69.6 => 0.69.6
react-native-web: ~0.18.7 => 0.18.9
Expo Workflow: bare
```
This only says bare because I've run `expo prebuild` as per notifee's install instructions; however, I haven't changed anything in the android folder.
### Reproducible demo
https://github.com/TheFunEmbargo/QuotesOfNote
### Stacktrace (if a crash is involved)
The full log is identical to the one in the summary above; the failing portion is:

```
[stderr] FAILURE: Build failed with an exception.
[stderr] * What went wrong:
[stderr] Could not determine the dependencies of task ':app:compileDebugJavaWithJavac'.
[stderr] > Failed to find Platform SDK with path: platforms;android-46
[stderr] BUILD FAILED in 3m 44s
Error: Gradle build failed with unknown error. See logs for the "Run gradlew" phase for more information.
```
| non_priority | create development build using eas with notifee could not determine the dependencies of task app compiledebugjavawithjavac summary create development build using eas i ve followed the running expo prebuild eas build profile development platform android with the latest the expo dev client package expo cli eas cli and java the build fails in run gradlew as follows running gradlew app assembledebug in home expo workingdir build android downloading welcome to gradle here are the highlights of this release easily declare new test suites in java projects support for java support for scala for more details see to honour the jvm settings for this build a single use daemon process will be forked see daemon will be stopped at the end of the build task react native gradle plugin compilekotlin compilejava task current target is and compilekotlin task current target is jvm target compatibility should be set to the same java version task react native gradle plugin plugindescriptors task react native gradle plugin processresources task react native gradle plugin compilekotlin w home expo workingdir build node modules react native gradle plugin src main kotlin com facebook react reactextension kt basevariant is deprecated deprecated in java w home expo workingdir build node modules react native gradle plugin src main kotlin com facebook react reactextension kt basevariant is deprecated deprecated in java w home expo workingdir build node modules react native gradle plugin src main kotlin com facebook react reactextension kt basevariant is deprecated deprecated in java w home expo workingdir build node modules react native gradle plugin src main kotlin com facebook react reactextension kt basevariant is deprecated deprecated in java w home expo workingdir build node modules react native gradle plugin src main kotlin com facebook react reactextension kt basevariant is deprecated deprecated in java w home expo workingdir build node modules react native gradle plugin 
src main kotlin com facebook react reactextension kt basevariant is deprecated deprecated in java w home expo workingdir build node modules react native gradle plugin src main kotlin com facebook react reactplugin kt reactroot directoryproperty is deprecated reactroot was confusing and has been replace with root to point to your root project and reactnativedir to point to the folder of the react native npm package w home expo workingdir build node modules react native gradle plugin src main kotlin com facebook react taskconfiguration kt applicationvariant is deprecated deprecated in java w home expo workingdir build node modules react native gradle plugin src main kotlin com facebook react taskconfiguration kt basevariant is deprecated deprecated in java w home expo workingdir build node modules react native gradle plugin src main kotlin com facebook react taskconfiguration kt libraryvariant is deprecated deprecated in java w home expo workingdir build node modules react native gradle plugin src main kotlin com facebook react taskconfiguration kt basevariant is deprecated deprecated in java w home expo workingdir build node modules react native gradle plugin src main kotlin com facebook react taskconfiguration kt applicationvariant is deprecated deprecated in java w home expo workingdir build node modules react native gradle plugin src main kotlin com facebook react taskconfiguration kt libraryvariant is deprecated deprecated in java w home expo workingdir build node modules react native gradle plugin src main kotlin com facebook react taskconfiguration kt basevariant is deprecated deprecated in java task react native gradle plugin compilejava note home expo workingdir build node modules react native gradle plugin src main java com facebook react codegen generator schemajsonparser java uses or overrides a deprecated api note recompile with xlint deprecation for details task react native gradle plugin classes task react native gradle plugin inspectclassesforkotlinic 
task react native gradle plugin jar configure project expo application warning software components will not be created automatically for maven publishing from android gradle plugin to opt in to the future behavior set the gradle property android disableautomaticcomponentcreation true in the gradle properties file or use the new publishing dsl configure project expo background fetch warning software components will not be created automatically for maven publishing from android gradle plugin to opt in to the future behavior set the gradle property android disableautomaticcomponentcreation true in the gradle properties file or use the new publishing dsl configure project expo constants warning software components will not be created automatically for maven publishing from android gradle plugin to opt in to the future behavior set the gradle property android disableautomaticcomponentcreation true in the gradle properties file or use the new publishing dsl configure project expo dev client warning software components will not be created automatically for maven publishing from android gradle plugin to opt in to the future behavior set the gradle property android disableautomaticcomponentcreation true in the gradle properties file or use the new publishing dsl configure project expo dev launcher warning software components will not be created automatically for maven publishing from android gradle plugin to opt in to the future behavior set the gradle property android disableautomaticcomponentcreation true in the gradle properties file or use the new publishing dsl configure project expo dev menu warning software components will not be created automatically for maven publishing from android gradle plugin to opt in to the future behavior set the gradle property android disableautomaticcomponentcreation true in the gradle properties file or use the new publishing dsl configure project expo dev menu interface warning software components will not be created automatically for 
maven publishing from android gradle plugin to opt in to the future behavior set the gradle property android disableautomaticcomponentcreation true in the gradle properties file or use the new publishing dsl configure project expo error recovery warning software components will not be created automatically for maven publishing from android gradle plugin to opt in to the future behavior set the gradle property android disableautomaticcomponentcreation true in the gradle properties file or use the new publishing dsl configure project expo file system warning software components will not be created automatically for maven publishing from android gradle plugin to opt in to the future behavior set the gradle property android disableautomaticcomponentcreation true in the gradle properties file or use the new publishing dsl configure project expo font warning software components will not be created automatically for maven publishing from android gradle plugin to opt in to the future behavior set the gradle property android disableautomaticcomponentcreation true in the gradle properties file or use the new publishing dsl configure project expo json utils warning software components will not be created automatically for maven publishing from android gradle plugin to opt in to the future behavior set the gradle property android disableautomaticcomponentcreation true in the gradle properties file or use the new publishing dsl configure project expo keep awake warning software components will not be created automatically for maven publishing from android gradle plugin to opt in to the future behavior set the gradle property android disableautomaticcomponentcreation true in the gradle properties file or use the new publishing dsl configure project expo manifests warning software components will not be created automatically for maven publishing from android gradle plugin to opt in to the future behavior set the gradle property android disableautomaticcomponentcreation true in 
the gradle properties file or use the new publishing dsl configure project expo modules core warning software components will not be created automatically for maven publishing from android gradle plugin to opt in to the future behavior set the gradle property android disableautomaticcomponentcreation true in the gradle properties file or use the new publishing dsl checking the license for package ndk side by side in home expo android sdk licenses license for package ndk side by side accepted preparing install ndk side by side revision install ndk side by side revision ready installing ndk side by side in home expo android sdk ndk install ndk side by side revision complete install ndk side by side revision finished checking the license for package cmake in home expo android sdk licenses license for package cmake accepted preparing install cmake revision install cmake revision ready installing cmake in home expo android sdk cmake install cmake revision complete install cmake revision finished configure project expo splash screen warning software components will not be created automatically for maven publishing from android gradle plugin to opt in to the future behavior set the gradle property android disableautomaticcomponentcreation true in the gradle properties file or use the new publishing dsl configure project expo task manager warning software components will not be created automatically for maven publishing from android gradle plugin to opt in to the future behavior set the gradle property android disableautomaticcomponentcreation true in the gradle properties file or use the new publishing dsl configure project expo updates interface warning the kotlin android extensions gradle plugin is deprecated please use this migration guide to start working with view binding and the kotlin parcelize plugin warning software components will not be created automatically for maven publishing from android gradle plugin to opt in to the future behavior set the gradle property 
android disableautomaticcomponentcreation true in the gradle properties file or use the new publishing dsl configure project unimodules app loader warning software components will not be created automatically for maven publishing from android gradle plugin to opt in to the future behavior set the gradle property android disableautomaticcomponentcreation true in the gradle properties file or use the new publishing dsl configure project expo using expo modules expo application expo background fetch expo constants expo dev client expo dev launcher expo dev menu expo error recovery expo file system expo font expo json utils expo keep awake expo manifests expo modules core expo splash screen expo task manager unimodules app loader warning software components will not be created automatically for maven publishing from android gradle plugin to opt in to the future behavior set the gradle property android disableautomaticcomponentcreation true in the gradle properties file or use the new publishing dsl configure project notifee react native notifee react native notifee react native found at home expo workingdir build node modules notifee react native notifee react native package json found at home expo workingdir build node modules notifee react native package json notifee react native version set from package json notifee react native android compilesdk using custom value notifee react native android targetsdk using custom value notifee react native android minsdk using custom value notifee react native reactnativeandroiddir home expo workingdir build node modules react native android warning we recommend using a newer android gradle plugin to use compilesdk this android gradle plugin was tested up to compilesdk this warning can be suppressed by adding android suppressunsupportedcompilesdk to this project s gradle properties the build will continue but you are strongly encouraged to update your project to use a newer android gradle plugin that has been tested with 
compilesdk checking the license for package android sdk build tools in home expo android sdk licenses license for package android sdk build tools accepted preparing install android sdk build tools revision install android sdk build tools revision ready installing android sdk build tools in home expo android sdk build tools install android sdk build tools revision complete install android sdk build tools revision finished deprecated gradle features were used in this build making it incompatible with gradle you can use warning mode all to show the individual deprecation warnings and determine if they come from your own scripts or plugins failure build failed with an exception what went wrong could not determine the dependencies of task app compiledebugjavawithjavac failed to find platform sdk with path platforms android try run with stacktrace option to get the stack trace run with info or debug option to get more log output run with scan to get full insights get more help at see actionable tasks executed build failed in error gradle build failed with unknown error see logs for the run gradlew phase for more information eas json cli version build development developmentclient true distribution internal android image ubuntu jdk ndk package json dependencies dependencies notifee react native react native async storage async storage expo expo background fetch expo build properties expo dev client expo splash screen expo status bar react react dom react native react native web app json pluggins plugins notifee react native expo build properties android compilesdkversion managed or bare workflow managed what platform s does this occur on android package versions dependencies notifee react native react native async storage async storage expo expo background fetch expo build properties expo dev client expo splash screen expo status bar expo updates react react dom react native react native web environment expo env info environment info system os windows binaries node c 
program files nodejs node exe npm c program files nodejs npm cmd ides android studio version ai npmpackages expo react react dom react native react native web expo workflow bare this only says bare as i ve expo prebuild d as per notifee s how to install however i ve not changed anything in the android folder reproducible demo stacktrace if a crash is involved running gradlew app assembledebug in home expo workingdir build android downloading welcome to gradle here are the highlights of this release easily declare new test suites in java projects support for java support for scala for more details see to honour the jvm settings for this build a single use daemon process will be forked see daemon will be stopped at the end of the build task react native gradle plugin compilekotlin compilejava task current target is and compilekotlin task current target is jvm target compatibility should be set to the same java version task react native gradle plugin plugindescriptors task react native gradle plugin processresources task react native gradle plugin compilekotlin w home expo workingdir build node modules react native gradle plugin src main kotlin com facebook react reactextension kt basevariant is deprecated deprecated in java w home expo workingdir build node modules react native gradle plugin src main kotlin com facebook react reactextension kt basevariant is deprecated deprecated in java w home expo workingdir build node modules react native gradle plugin src main kotlin com facebook react reactextension kt basevariant is deprecated deprecated in java w home expo workingdir build node modules react native gradle plugin src main kotlin com facebook react reactextension kt basevariant is deprecated deprecated in java w home expo workingdir build node modules react native gradle plugin src main kotlin com facebook react reactextension kt basevariant is deprecated deprecated in java w home expo workingdir build node modules react native gradle plugin src main kotlin com 
facebook react reactextension kt basevariant is deprecated deprecated in java w home expo workingdir build node modules react native gradle plugin src main kotlin com facebook react reactplugin kt reactroot directoryproperty is deprecated reactroot was confusing and has been replace with root to point to your root project and reactnativedir to point to the folder of the react native npm package w home expo workingdir build node modules react native gradle plugin src main kotlin com facebook react taskconfiguration kt applicationvariant is deprecated deprecated in java w home expo workingdir build node modules react native gradle plugin src main kotlin com facebook react taskconfiguration kt basevariant is deprecated deprecated in java w home expo workingdir build node modules react native gradle plugin src main kotlin com facebook react taskconfiguration kt libraryvariant is deprecated deprecated in java w home expo workingdir build node modules react native gradle plugin src main kotlin com facebook react taskconfiguration kt basevariant is deprecated deprecated in java w home expo workingdir build node modules react native gradle plugin src main kotlin com facebook react taskconfiguration kt applicationvariant is deprecated deprecated in java w home expo workingdir build node modules react native gradle plugin src main kotlin com facebook react taskconfiguration kt libraryvariant is deprecated deprecated in java w home expo workingdir build node modules react native gradle plugin src main kotlin com facebook react taskconfiguration kt basevariant is deprecated deprecated in java task react native gradle plugin compilejava note home expo workingdir build node modules react native gradle plugin src main java com facebook react codegen generator schemajsonparser java uses or overrides a deprecated api note recompile with xlint deprecation for details task react native gradle plugin classes task react native gradle plugin inspectclassesforkotlinic task react native 
gradle plugin jar configure project expo application warning software components will not be created automatically for maven publishing from android gradle plugin to opt in to the future behavior set the gradle property android disableautomaticcomponentcreation true in the gradle properties file or use the new publishing dsl configure project expo background fetch warning software components will not be created automatically for maven publishing from android gradle plugin to opt in to the future behavior set the gradle property android disableautomaticcomponentcreation true in the gradle properties file or use the new publishing dsl configure project expo constants warning software components will not be created automatically for maven publishing from android gradle plugin to opt in to the future behavior set the gradle property android disableautomaticcomponentcreation true in the gradle properties file or use the new publishing dsl configure project expo dev client warning software components will not be created automatically for maven publishing from android gradle plugin to opt in to the future behavior set the gradle property android disableautomaticcomponentcreation true in the gradle properties file or use the new publishing dsl configure project expo dev launcher warning software components will not be created automatically for maven publishing from android gradle plugin to opt in to the future behavior set the gradle property android disableautomaticcomponentcreation true in the gradle properties file or use the new publishing dsl configure project expo dev menu warning software components will not be created automatically for maven publishing from android gradle plugin to opt in to the future behavior set the gradle property android disableautomaticcomponentcreation true in the gradle properties file or use the new publishing dsl configure project expo dev menu interface warning software components will not be created automatically for maven publishing 
from android gradle plugin to opt in to the future behavior set the gradle property android disableautomaticcomponentcreation true in the gradle properties file or use the new publishing dsl configure project expo error recovery warning software components will not be created automatically for maven publishing from android gradle plugin to opt in to the future behavior set the gradle property android disableautomaticcomponentcreation true in the gradle properties file or use the new publishing dsl configure project expo file system warning software components will not be created automatically for maven publishing from android gradle plugin to opt in to the future behavior set the gradle property android disableautomaticcomponentcreation true in the gradle properties file or use the new publishing dsl configure project expo font warning software components will not be created automatically for maven publishing from android gradle plugin to opt in to the future behavior set the gradle property android disableautomaticcomponentcreation true in the gradle properties file or use the new publishing dsl configure project expo json utils warning software components will not be created automatically for maven publishing from android gradle plugin to opt in to the future behavior set the gradle property android disableautomaticcomponentcreation true in the gradle properties file or use the new publishing dsl configure project expo keep awake warning software components will not be created automatically for maven publishing from android gradle plugin to opt in to the future behavior set the gradle property android disableautomaticcomponentcreation true in the gradle properties file or use the new publishing dsl configure project expo manifests warning software components will not be created automatically for maven publishing from android gradle plugin to opt in to the future behavior set the gradle property android disableautomaticcomponentcreation true in the gradle 
properties file or use the new publishing dsl configure project expo modules core warning software components will not be created automatically for maven publishing from android gradle plugin to opt in to the future behavior set the gradle property android disableautomaticcomponentcreation true in the gradle properties file or use the new publishing dsl checking the license for package ndk side by side in home expo android sdk licenses license for package ndk side by side accepted preparing install ndk side by side revision install ndk side by side revision ready installing ndk side by side in home expo android sdk ndk install ndk side by side revision complete install ndk side by side revision finished checking the license for package cmake in home expo android sdk licenses license for package cmake accepted preparing install cmake revision install cmake revision ready installing cmake in home expo android sdk cmake install cmake revision complete install cmake revision finished configure project expo splash screen warning software components will not be created automatically for maven publishing from android gradle plugin to opt in to the future behavior set the gradle property android disableautomaticcomponentcreation true in the gradle properties file or use the new publishing dsl configure project expo task manager warning software components will not be created automatically for maven publishing from android gradle plugin to opt in to the future behavior set the gradle property android disableautomaticcomponentcreation true in the gradle properties file or use the new publishing dsl configure project expo updates interface warning the kotlin android extensions gradle plugin is deprecated please use this migration guide to start working with view binding and the kotlin parcelize plugin warning software components will not be created automatically for maven publishing from android gradle plugin to opt in to the future behavior set the gradle property android 
disableautomaticcomponentcreation true in the gradle properties file or use the new publishing dsl configure project unimodules app loader warning software components will not be created automatically for maven publishing from android gradle plugin to opt in to the future behavior set the gradle property android disableautomaticcomponentcreation true in the gradle properties file or use the new publishing dsl configure project expo using expo modules expo application expo background fetch expo constants expo dev client expo dev launcher expo dev menu expo error recovery expo file system expo font expo json utils expo keep awake expo manifests expo modules core expo splash screen expo task manager unimodules app loader warning software components will not be created automatically for maven publishing from android gradle plugin to opt in to the future behavior set the gradle property android disableautomaticcomponentcreation true in the gradle properties file or use the new publishing dsl configure project notifee react native notifee react native notifee react native found at home expo workingdir build node modules notifee react native notifee react native package json found at home expo workingdir build node modules notifee react native package json notifee react native version set from package json notifee react native android compilesdk using custom value notifee react native android targetsdk using custom value notifee react native android minsdk using custom value notifee react native reactnativeandroiddir home expo workingdir build node modules react native android warning we recommend using a newer android gradle plugin to use compilesdk this android gradle plugin was tested up to compilesdk this warning can be suppressed by adding android suppressunsupportedcompilesdk to this project s gradle properties the build will continue but you are strongly encouraged to update your project to use a newer android gradle plugin that has been tested with compilesdk 
checking the license for package android sdk build tools in home expo android sdk licenses license for package android sdk build tools accepted preparing install android sdk build tools revision install android sdk build tools revision ready installing android sdk build tools in home expo android sdk build tools install android sdk build tools revision complete install android sdk build tools revision finished deprecated gradle features were used in this build making it incompatible with gradle you can use warning mode all to show the individual deprecation warnings and determine if they come from your own scripts or plugins failure build failed with an exception what went wrong could not determine the dependencies of task app compiledebugjavawithjavac failed to find platform sdk with path platforms android try run with stacktrace option to get the stack trace run with info or debug option to get more log output run with scan to get full insights get more help at see actionable tasks executed build failed in error gradle build failed with unknown error see logs for the run gradlew phase for more information | 0 |
610,722 | 18,922,574,887 | IssuesEvent | 2021-11-17 04:47:30 | CMPUT301F21T20/HabitTracker | https://api.github.com/repos/CMPUT301F21T20/HabitTracker | closed | 6.1 Habit Event Location | priority: medium Habit Events Geolocation and Maps complexity: high | User Story: As a doer, I want a habit event to have an optional location to record where it happened.
When a user creates a habit event we need to give them the option to save their location. This requires asking for location permission and saving the location along with the other habit event info in Firestore.
This feature requires interacting with Android Location API and Firestore making it highly complex. Despite this, the feature is not essential to have a fully working app so it is a medium priority.
Story Points: 4
| 1.0 | 6.1 Habit Event Location - User Story: As a doer, I want a habit event to have an optional location to record where it happened.
When a user creates a habit event we need to give them the option to save their location. This requires asking for location permission and saving the location along with the other habit event info in Firestore.
This feature requires interacting with Android Location API and Firestore making it highly complex. Despite this, the feature is not essential to have a fully working app so it is a medium priority.
Story Points: 4
| priority | habit event location user story as a doer i want a habit event to have an optional location to record where it happened when a user creates a habit event we need to give them the option to save their location this requires asking for location permission and saving the location along with the other habit event info in firestore this feature requires interacting with android location api and firestore making it highly complex despite this the feature is not essential to have a fully working app so it is a medium priority story points | 1 |
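The flow described in this row — recording a habit event with an optional location once the user grants permission — can be sketched in a minimal, backend-agnostic way. This is an illustration only: the field names and document shape below are hypothetical, not the HabitTracker project's actual Firestore schema, and a real app would pass the dict to a Firestore client rather than just returning it.

```python
from datetime import datetime, timezone

def build_habit_event(habit_id, comment=None, location=None):
    """Build a Firestore-style document for a habit event.

    `location` is optional, matching the user story: a (latitude,
    longitude) tuple recorded only if location permission was granted.
    """
    event = {
        "habitId": habit_id,
        "timestamp": datetime.now(timezone.utc).isoformat(),
    }
    if comment:
        event["comment"] = comment
    if location is not None:
        lat, lon = location
        event["location"] = {"latitude": lat, "longitude": lon}
    return event
```

Keeping the location key absent (rather than null) when no location was recorded makes the "optional" part of the story explicit in the stored document.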
9,235 | 27,776,174,150 | IssuesEvent | 2023-03-16 17:22:24 | gchq/gaffer-docker | https://api.github.com/repos/gchq/gaffer-docker | opened | Update image tagging to work with new versioning | Docker automation | The changes in #275 add the Accumulo version to the version tag for certain images. The GitHub Actions workflow requires updating to work correctly with this new tag format.
This is also a good opportunity to upgrade the workflows to make greater use of Actions instead of scripts and to also push images to GHCR. | 1.0 | Update image tagging to work with new versioning - The changes in #275 add the Accumulo version to the version tag for certain images. The GitHub Actions workflow requires updating to work correctly with this new tag format.
This is also a good opportunity to upgrade the workflows to make greater use of Actions instead of scripts and to also push images to GHCR. | non_priority | update image tagging to work with new versioning the changes in add the accumulo version to the version tag for certain images the github actions workflow requires updating to work correctly with this new tag format this is also a good opportunity to upgrade the workflows to make greater use of actions instead of scripts and to also push images to ghcr | 0 |
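The tag-format change this row describes can be exercised with a small helper. This is a hedged sketch: the exact layout (`<image-version>-accumulo-<accumulo-version>`) is an assumption for illustration, not the repository's confirmed scheme, and the versions used below are made up.

```python
def build_image_tag(image_version, accumulo_version=None):
    """Build a container image tag, appending the Accumulo version
    only for images that bundle Accumulo (hypothetical format)."""
    if accumulo_version is None:
        return image_version
    return f"{image_version}-accumulo-{accumulo_version}"

def parse_image_tag(tag):
    """Split a tag back into (image_version, accumulo_version or None)."""
    head, sep, accumulo = tag.partition("-accumulo-")
    return (head, accumulo if sep else None)
```

A workflow updated for the new format would need both directions: building tags when pushing images, and parsing them when deciding which images a release touches.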
257,355 | 8,136,301,283 | IssuesEvent | 2018-08-20 07:56:31 | ow2-proactive/scheduling-portal | https://api.github.com/repos/ow2-proactive/scheduling-portal | opened | When a user doesn't have the right to execute scripts from the Resource Manager portal, an HTTP 500 error is reported | priority:low severity:minor type:bug | When a user doesn't have the right to execute scripts from the RM Portal, the portal reports an HTTP 500 error. This error message is confusing. Better to replace it with a message like "You are not authorised to execute scripts on this node, please contact the node's administrator." | 1.0 | When a user doesn't have the right to execute scripts from the Resource Manager portal, an HTTP 500 error is reported - When a user doesn't have the right to execute scripts from the RM Portal, the portal reports an HTTP 500 error. This error message is confusing. Better to replace it with a message like "You are not authorised to execute scripts on this node, please contact the node's administrator." | priority | when a user doesn t have the right to execute scripts from the resource manager portal an http error is reported when a user doesn t have the right to execute scripts from the rm portal the portal reports an http error this error message is confusing better to replace it with a message like you are not authorised to execute scripts on this node please contact the node s administrator | 1 |
553,064 | 16,342,922,199 | IssuesEvent | 2021-05-13 01:27:40 | knative/eventing | https://api.github.com/repos/knative/eventing | closed | Make dynamically created resources configurable | kind/feature-request lifecycle/stale priority/important-soon | **Problem**
Most of the resources created dynamically are not configurable. For instance see how the [PingSource receive adapter](https://github.com/knative/eventing/blob/master/pkg/reconciler/pingsource/resources/mt_receive_adapter.go#L49) is created.
**[Persona:](https://github.com/knative/eventing/blob/master/docs/personas.md)**
Which persona is this feature for?
Operator
**Exit Criteria**
A measurable (binary) test that would indicate that the problem has been resolved.
Receive adapter deployment or ksvc with extensions
**Time Estimate (optional):**
How many developer-days do you think this may take to resolve?
2-3
**Additional context (optional)**
The current solution relies on both `DeepDerivative` and an external way to dynamically patch resources. That's less than ideal. Not all controllers are using `DeepDerivative` (is this a bug?) and not all fields can be updated dynamically. Also what is "the external way"? We certainly need something more self-contained.
Another solution is to do what the alt Kafka channel implementation is doing: using [env vars](https://github.com/knative-sandbox/eventing-kafka/blob/92b3fc674c941409e716f1f4c62dbfd9a33578d8/config/400-deployment.yaml) (and soon [ConfigMap](https://github.com/knative-sandbox/eventing-kafka/blob/d7cd8aa44bd9adc6c0ddafaf0c1e04e9ba2ff0c8/config/200-eventing-kafka-configmap.yaml)) to specify the configuration parameters. The major (IMO) drawback of this solution is 1) it is not generic 2) there is always the risk of forgetting something. For instance there is no way to add `affinity` rules in there without modifying the code.
A third solution is to specify the expected deployment or ksvc in a config map, similar to what we already do with the [default channel template](https://github.com/knative/eventing/blob/master/config/core/configmaps/default-broker-channel.yaml).
For instance `config-eventing.yaml` could have the following configuration:
```
apiVersion: v1
kind: ConfigMap
metadata:
name: config-eventing
namespace: knative-eventing
data:
mt-pingsource-adapter: |
apiVersion: apps/v1
kind: Deployment
metadata:
name: pingsource-mt-adapter
spec:
template:
spec:
containers:
- name: dispatcher
resources:
limits:
cpu: "1"
memory: 2Gi
requests:
cpu: 125m
memory: 64Mi
```
The eventing webhook must first validate the specified fields that are under its control (e.g. `apiVersion`, `kind`, `name`, `namespace`, `spec.template.spec.containers[0].name`). If `image` is specified, the webhook should reject the configmap.
The eventing controller would then augment the resource with the fields it "owns", like `serviceAccountName`, `image`, and so on.
The exact behavior is yet to be defined. I'm also planning to create a feature track document.
Comments? @cr22rc @n3wscott @eric-sap @travis-minke-sap @vaikas @grantr
| 1.0 | Make dynamically created resources configurable - **Problem**
Most of the resources created dynamically are not configurable. For instance see how the [PingSource receive adapter](https://github.com/knative/eventing/blob/master/pkg/reconciler/pingsource/resources/mt_receive_adapter.go#L49) is created.
**[Persona:](https://github.com/knative/eventing/blob/master/docs/personas.md)**
Which persona is this feature for?
Operator
**Exit Criteria**
A measurable (binary) test that would indicate that the problem has been resolved.
Receive adapter deployment or ksvc with extensions
**Time Estimate (optional):**
How many developer-days do you think this may take to resolve?
2-3
**Additional context (optional)**
The current solution relies on both `DeepDerivative` and an external way to dynamically patch resources. That's less than ideal. Not all controllers are using `DeepDerivative` (is this a bug?) and not all fields can be updated dynamically. Also what is "the external way"? We certainly need something more self-contained.
Another solution is to do what the alt Kafka channel implementation is doing: using [env vars](https://github.com/knative-sandbox/eventing-kafka/blob/92b3fc674c941409e716f1f4c62dbfd9a33578d8/config/400-deployment.yaml) (and soon [ConfigMap](https://github.com/knative-sandbox/eventing-kafka/blob/d7cd8aa44bd9adc6c0ddafaf0c1e04e9ba2ff0c8/config/200-eventing-kafka-configmap.yaml)) to specify the configuration parameters. The major (IMO) drawback of this solution is 1) it is not generic 2) there is always the risk of forgetting something. For instance there is no way to add `affinity` rules in there without modifying the code.
A third solution is to specify the expected deployment or ksvc in a config map, similar to what we already do with the [default channel template](https://github.com/knative/eventing/blob/master/config/core/configmaps/default-broker-channel.yaml).
For instance `config-eventing.yaml` could have the following configuration:
```
apiVersion: v1
kind: ConfigMap
metadata:
name: config-eventing
namespace: knative-eventing
data:
mt-pingsource-adapter: |
apiVersion: apps/v1
kind: Deployment
metadata:
name: pingsource-mt-adapter
spec:
template:
spec:
containers:
- name: dispatcher
resources:
limits:
cpu: "1"
memory: 2Gi
requests:
cpu: 125m
memory: 64Mi
```
The eventing webhook must first validate the specified fields that are under its control (e.g. `apiVersion`, `kind`, `name`, `namespace`, `spec.template.spec.containers[0].name`). If `image` is specified, the webhook should reject the configmap.
The eventing controller would then augment the resource with the fields it "owns", like `serviceAccountName`, `image`, and so on.
The exact behavior is yet to be defined. I'm also planning to create a feature track document.
Comments? @cr22rc @n3wscott @eric-sap @travis-minke-sap @vaikas @grantr
| priority | make dynamically created resources configurable problem most of the resources created dynamically are not configurable for instance see how the is created which persona is this feature for operator exit criteria a measurable binary test that would indicate that the problem has been resolved receive adapter deployment or ksvc with extensions time estimate optional how many developer days do you think this may take to resolve additional context optional the current solution relies on both deepderivative and an external way to dynamically patch resources that s less than ideal not all controllers are using deepderivative is this a bug and not all fields can be updated dynamically also what is the external way we certainly need something more self contained another solution is to do what the alt kafka channel implementation is doing using and soon to specify the configuration parameters the major imo drawback of this solution is it is not generic there is always the risk of forgetting something for instance there is no way to add affinity rules in there without modifying the code a third solution is specify the expected deployment or ksvc in a config map similar to what we already do with the for instance config eventing yaml could have the following configuration apiversion kind configmap metadata name config eventing namespace knative eventing data mt pingsource adapter apiversion apps kind deployment metadata name pingsource mt adapter spec template spec containers name dispatcher resources limits cpu memory requests cpu memory the eventing webhook must first validate the specified fields that are under its control e g apiversion kind name namespace spec template spec containers name if image is specified the webhook should reject the configmap the eventing controller would then augment the resource with the fields it owns like serviceaccountname image and so on the exact behavior is yet to be defined i m also planning to create a feature track document 
comments eric sap travis minke sap vaikas grantr | 1 |
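The webhook behavior proposed in the row above — accept overrides of tunable fields (resources, affinity, ...) but reject controller-owned ones — can be illustrated in a few lines. This is a hedged sketch in plain Python, not Knative's actual Go admission code; the controlled-field list and expected values are illustrative, taken from the example ConfigMap in the issue.

```python
# Controller-owned values for the pingsource-mt-adapter Deployment
# (illustrative; a real controller would derive these from its own config).
EXPECTED = {
    "apiVersion": "apps/v1",
    "kind": "Deployment",
    "metadata": {"name": "pingsource-mt-adapter"},
}

def get_path(obj, path):
    """Walk a nested dict/list by a tuple of keys; None if the path is absent."""
    for key in path:
        try:
            obj = obj[key]
        except (KeyError, IndexError, TypeError):
            return None
    return obj

def validate_override(override):
    """Webhook-style check: reject overrides touching controller-owned fields."""
    errors = []
    # Fields that, if specified at all, must match the controller's values.
    for path in [("apiVersion",), ("kind",), ("metadata", "name")]:
        got = get_path(override, path)
        want = get_path(EXPECTED, path)
        if got is not None and got != want:
            errors.append("%s must be %r" % (".".join(path), want))
    # The container image is always owned by the controller: never allowed.
    if get_path(override, ("spec", "template", "spec", "containers", 0, "image")) is not None:
        errors.append("spec.template.spec.containers[0].image must not be set")
    return errors
```

A real webhook would read the override out of the ConfigMap, run a check like this, and deny admission when `errors` is non-empty; the controller would then merge the accepted override with the fields it owns (`serviceAccountName`, `image`, and so on).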
46,026 | 13,055,840,532 | IssuesEvent | 2020-07-30 02:53:37 | icecube-trac/tix2 | https://api.github.com/repos/icecube-trac/tix2 | opened | const-correctness in icetray (Trac #459) | Incomplete Migration Migrated from Trac combo reconstruction defect | Migrated from https://code.icecube.wisc.edu/ticket/459
```json
{
"status": "closed",
"changetime": "2013-03-20T21:11:36",
"description": "I found small bugs related to const methods. These are trivially corrected by adding a 'const' keyword.\n\nin icetray/private/icetray/PythonModule.h:\n\n\n{{{\n- const I3Context& GetContext() { return Base::context_; }\n- const I3Configuration& GetConfiguration() { return Base::configuration_; }\n+ const I3Context& GetContext() const { return Base::context_; }\n+ const I3Configuration& GetConfiguration() const { return Base::configuration_; }\n}}}\n\n\nin icetray/public/icetray/I3ServiceBase.h\n\n{{{\n- const I3Configuration& GetConfiguration() { return *configuration_; }\n- const I3Context& GetContext() { return context_; }\n+ const I3Configuration& GetConfiguration() const { return *configuration_; }\n+ const I3Context& GetContext() const { return context_; }\n}}}\n\n\nOne can get around the problem in I3ServiceBase by doing a const_cast, but that is not so with PythonModule, I believe.\n\nWould you need a test where these things cause a failure?",
"reporter": "jgonzalez",
"cc": "",
"resolution": "fixed",
"_ts": "1363813896000000",
"component": "combo reconstruction",
"summary": "const-correctness in icetray",
"priority": "normal",
"keywords": "",
"time": "2013-03-18T15:16:26",
"milestone": "",
"owner": "blaufuss",
"type": "defect"
}
```
| 1.0 | const-correctness in icetray (Trac #459) - Migrated from https://code.icecube.wisc.edu/ticket/459
```json
{
"status": "closed",
"changetime": "2013-03-20T21:11:36",
"description": "I found small bugs related to const methods. These are trivially corrected by adding a 'const' keyword.\n\nin icetray/private/icetray/PythonModule.h:\n\n\n{{{\n- const I3Context& GetContext() { return Base::context_; }\n- const I3Configuration& GetConfiguration() { return Base::configuration_; }\n+ const I3Context& GetContext() const { return Base::context_; }\n+ const I3Configuration& GetConfiguration() const { return Base::configuration_; }\n}}}\n\n\nin icetray/public/icetray/I3ServiceBase.h\n\n{{{\n- const I3Configuration& GetConfiguration() { return *configuration_; }\n- const I3Context& GetContext() { return context_; }\n+ const I3Configuration& GetConfiguration() const { return *configuration_; }\n+ const I3Context& GetContext() const { return context_; }\n}}}\n\n\nOne can get around the problem in I3ServiceBase by doing a const_cast, but that is not so with PythonModule, I believe.\n\nWould you need a test where these things cause a failure?",
"reporter": "jgonzalez",
"cc": "",
"resolution": "fixed",
"_ts": "1363813896000000",
"component": "combo reconstruction",
"summary": "const-correctness in icetray",
"priority": "normal",
"keywords": "",
"time": "2013-03-18T15:16:26",
"milestone": "",
"owner": "blaufuss",
"type": "defect"
}
```
| non_priority | const correctness in icetray trac migrated from json status closed changetime description i found small bugs related to const methods these are trivially corrected by adding a const keyword n nin icetray private icetray pythonmodule h n n n n const getcontext return base context n const getconfiguration return base configuration n const getcontext const return base context n const getconfiguration const return base configuration n n n nin icetray public icetray h n n n const getconfiguration return configuration n const getcontext return context n const getconfiguration const return configuration n const getcontext const return context n n n none can get around the problem in by doing a const cast but that is not so with pythonmodule i believe n nwould you need a test where these things cause a failure reporter jgonzalez cc resolution fixed ts component combo reconstruction summary const correctness in icetray priority normal keywords time milestone owner blaufuss type defect | 0 |
585,703 | 17,515,238,334 | IssuesEvent | 2021-08-11 05:31:50 | GIST-Petition-Site-Project/GIST-petition-web | https://api.github.com/repos/GIST-Petition-Site-Project/GIST-petition-web | opened | Sign-up page code refactoring and UI improvements for real users | Type: Feature/UI Type: Feature/Function Status: To Do Priority: Medium | ## Feature description
- Declare the state as a single object instead of declaring each piece of state individually
- The UI is currently built for test users; once the email verification API is implemented, the UI should be improved to target real users
### Use cases
## Benefits
For whom and why.
## Requirements
- pages/SignUp.js
## Links / references
| 1.0 | Sign-up page code refactoring and UI improvements for real users - ## Feature description
- Declare the state as a single object instead of declaring each piece of state individually
- The UI is currently built for test users; once the email verification API is implemented, the UI should be improved to target real users
### Use cases
## Benefits
For whom and why.
## Requirements
- pages/SignUp.js
## Links / references
| priority | sign up page code refactoring and ui improvements for real users feature description declare the state as a single object instead of declaring each piece individually the ui is currently built for test users once the email verification api is implemented the ui should be improved to target real users use cases benefits for whom and why requirements pages signup js links references | 1 |
54,332 | 3,066,352,687 | IssuesEvent | 2015-08-18 00:46:55 | theminted/lesswrong-migrated | https://api.github.com/repos/theminted/lesswrong-migrated | opened | WebApp Error: <type 'exceptions.AttributeError'>: _id not found | bug imported Priority-Medium | _From [wjmo...@gmail.com](https://code.google.com/u/117567618910921056910/) on October 22, 2013 09:17:03_
Error occurred accessing http://lesswrong.com/comments/ See attached exception email.
**Attachment:** [_id not found.pdf](http://code.google.com/p/lesswrong/issues/detail?id=408)
_Original issue: http://code.google.com/p/lesswrong/issues/detail?id=408_ | 1.0 | WebApp Error: <type 'exceptions.AttributeError'>: _id not found - _From [wjmo...@gmail.com](https://code.google.com/u/117567618910921056910/) on October 22, 2013 09:17:03_
Error occurred accessing http://lesswrong.com/comments/ See attached exception email.
**Attachment:** [_id not found.pdf](http://code.google.com/p/lesswrong/issues/detail?id=408)
_Original issue: http://code.google.com/p/lesswrong/issues/detail?id=408_ | priority | webapp error id not found from on october error occurred accessing see attached exception email attachment original issue | 1 |
542,779 | 15,866,617,088 | IssuesEvent | 2021-04-08 15:56:33 | ESCOMP/CTSM | https://api.github.com/repos/ESCOMP/CTSM | closed | external munging with cdeps and fox | priority: low tag: next type: -external type: enhancement | ### Brief summary of bug
The following generates an unclean state in components/cdeps that I'm not quite sure how to clean out:
```
git clone git@github.com:ESCOMP/CTSM.git ctsm-test-cdeps
cd ctsm-test-cdeps/
git remote add ckoven_repo git@github.com:ckoven/CTSM.git
git fetch ckoven_repo
./manage_externals/checkout_externals
git checkout -b snow_occlusion_ctsm ckoven_repo/snow_occlusion_ctsm
./manage_externals/checkout_externals
git checkout master
./manage_externals/checkout_externals
```
generates this message:
```
cheyenne4 rgknox/ctsm-test-cdeps> ./manage_externals/checkout_externals
Processing externals description file : Externals.cfg
Processing externals description file : Externals_CLM.cfg
Processing externals description file : Externals_CISM.cfg
Processing externals description file : .gitmodules
Processing submodules description file : .gitmodules
Processing externals description file : .gitmodules
Processing submodules description file : .gitmodules
Checking status of externals: clm, fates, ptclm, mosart, mizuroute, cime, rtm, cism, source_cism, cdeps, fox, cmeps, nems/lib/parallelio, nems/lib/genf90, doc-builder,
s ./cime
s ./cime/src/drivers/nuopc/
./cime/src/drivers/nuopc/nems/lib/ParallelIO
./cime/src/drivers/nuopc/nems/lib/genf90
sM ./components/cdeps
./components/cdeps/fox
s ./components/cism
./components/cism/source_cism
./components/mizuRoute
s ./components/mosart
s ./components/rtm
e-o ./doc/doc-builder
./src/fates
./tools/PTCLM
----------------------------------------------------------------------
The external repositories labeled with 'M' above are not in a clean state.
The following are three options for how to proceed:
(1) Go into each external that is not in a clean state and issue either a 'git status' or
an 'svn status' command (depending on whether the external is managed by git or
svn). Either revert or commit your changes so that all externals are in a clean
state. (To revert changes in git, follow the instructions given when you run 'git
status'.) (Note, though, that it is okay to have untracked files in your working
directory.) Then rerun checkout_externals.
(2) Alternatively, you do not have to rely on checkout_externals. Instead, you can manually
update out-of-sync externals (labeled with 's' above) as described in the
configuration file Externals.cfg. (For example, run 'git fetch' and 'git checkout'
commands to checkout the appropriate tags for each external, as given in
Externals.cfg.)
(3) You can also use checkout_externals to manage most, but not all externals: You can specify
one or more externals to ignore using the '-x' or '--exclude' argument to
checkout_externals. Excluding externals labeled with 'M' will allow checkout_externals to
update the other, non-excluded externals.
The external repositories labeled with '?' above are not under version
control using the expected protocol. If you are sure you want to switch
protocols, and you don't have any work you need to save from this
directory, then run "rm -rf [directory]" before re-running the
checkout_externals tool.
----------------------------------------------------------------------
```
Here is the status of cdeps:
```
cheyenne4 components/cdeps> git status
HEAD detached at 45b7a85
Changes not staged for commit:
(use "git add <file>..." to update what will be committed)
(use "git checkout -- <file>..." to discard changes in working directory)
modified: fox (new commits)
Untracked files:
(use "git add <file>..." to include in what will be committed)
share/genf90/
no changes added to commit (use "git add" and/or "git commit -a")
```
I step into fox and here is the status:
```
cheyenne4 cdeps/fox> git status
HEAD detached at 4ff17b4
nothing to commit, working tree clean
```
| 1.0 | external munging with cdeps and fox - ### Brief summary of bug
The following generates an unclean state in components/cdeps that I'm not quite sure how to clean out:
```
git clone git@github.com:ESCOMP/CTSM.git ctsm-test-cdeps
cd ctsm-test-cdeps/
git remote add ckoven_repo git@github.com:ckoven/CTSM.git
git fetch ckoven_repo
./manage_externals/checkout_externals
git checkout -b snow_occlusion_ctsm ckoven_repo/snow_occlusion_ctsm
./manage_externals/checkout_externals
git checkout master
./manage_externals/checkout_externals
```
generates this message:
```
cheyenne4 rgknox/ctsm-test-cdeps> ./manage_externals/checkout_externals
Processing externals description file : Externals.cfg
Processing externals description file : Externals_CLM.cfg
Processing externals description file : Externals_CISM.cfg
Processing externals description file : .gitmodules
Processing submodules description file : .gitmodules
Processing externals description file : .gitmodules
Processing submodules description file : .gitmodules
Checking status of externals: clm, fates, ptclm, mosart, mizuroute, cime, rtm, cism, source_cism, cdeps, fox, cmeps, nems/lib/parallelio, nems/lib/genf90, doc-builder,
s ./cime
s ./cime/src/drivers/nuopc/
./cime/src/drivers/nuopc/nems/lib/ParallelIO
./cime/src/drivers/nuopc/nems/lib/genf90
sM ./components/cdeps
./components/cdeps/fox
s ./components/cism
./components/cism/source_cism
./components/mizuRoute
s ./components/mosart
s ./components/rtm
e-o ./doc/doc-builder
./src/fates
./tools/PTCLM
----------------------------------------------------------------------
The external repositories labeled with 'M' above are not in a clean state.
The following are three options for how to proceed:
(1) Go into each external that is not in a clean state and issue either a 'git status' or
an 'svn status' command (depending on whether the external is managed by git or
svn). Either revert or commit your changes so that all externals are in a clean
state. (To revert changes in git, follow the instructions given when you run 'git
status'.) (Note, though, that it is okay to have untracked files in your working
directory.) Then rerun checkout_externals.
(2) Alternatively, you do not have to rely on checkout_externals. Instead, you can manually
update out-of-sync externals (labeled with 's' above) as described in the
configuration file Externals.cfg. (For example, run 'git fetch' and 'git checkout'
commands to checkout the appropriate tags for each external, as given in
Externals.cfg.)
(3) You can also use checkout_externals to manage most, but not all externals: You can specify
one or more externals to ignore using the '-x' or '--exclude' argument to
checkout_externals. Excluding externals labeled with 'M' will allow checkout_externals to
update the other, non-excluded externals.
The external repositories labeled with '?' above are not under version
control using the expected protocol. If you are sure you want to switch
protocols, and you don't have any work you need to save from this
directory, then run "rm -rf [directory]" before re-running the
checkout_externals tool.
----------------------------------------------------------------------
```
Here is the status of cdeps:
```
cheyenne4 components/cdeps> git status
HEAD detached at 45b7a85
Changes not staged for commit:
(use "git add <file>..." to update what will be committed)
(use "git checkout -- <file>..." to discard changes in working directory)
modified: fox (new commits)
Untracked files:
(use "git add <file>..." to include in what will be committed)
share/genf90/
no changes added to commit (use "git add" and/or "git commit -a")
```
I step into fox and here is the status:
```
cheyenne4 cdeps/fox> git status
HEAD detached at 4ff17b4
nothing to commit, working tree clean
```
| priority | external munging with cdeps and fox brief summary of bug the following generates an unclean state in components cdeps that i m not quite sure how to clean out git clone git github com escomp ctsm git ctsm test cdeps cd ctsm test cdeps git remote add ckoven repo git github com ckoven ctsm git git fetch ckoven repo manage externals checkout externals git checkout b snow occlusion ctsm ckoven repo snow occlusion ctsm manage externals checkout externals git checkout master manage externals checkout externals generates this message rgknox ctsm test cdeps manage externals checkout externals processing externals description file externals cfg processing externals description file externals clm cfg processing externals description file externals cism cfg processing externals description file gitmodules processing submodules description file gitmodules processing externals description file gitmodules processing submodules description file gitmodules checking status of externals clm fates ptclm mosart mizuroute cime rtm cism source cism cdeps fox cmeps nems lib parallelio nems lib doc builder s cime s cime src drivers nuopc cime src drivers nuopc nems lib parallelio cime src drivers nuopc nems lib sm components cdeps components cdeps fox s components cism components cism source cism components mizuroute s components mosart s components rtm e o doc doc builder src fates tools ptclm the external repositories labeled with m above are not in a clean state the following are three options for how to proceed go into each external that is not in a clean state and issue either a git status or an svn status command depending on whether the external is managed by git or svn either revert or commit your changes so that all externals are in a clean state to revert changes in git follow the instructions given when you run git status note though that it is okay to have untracked files in your working directory then rerun checkout externals alternatively you do not have to rely 
on checkout externals instead you can manually update out of sync externals labeled with s above as described in the configuration file externals cfg for example run git fetch and git checkout commands to checkout the appropriate tags for each external as given in externals cfg you can also use checkout externals to manage most but not all externals you can specify one or more externals to ignore using the x or exclude argument to checkout externals excluding externals labeled with m will allow checkout externals to update the other non excluded externals the external repositories labeled with above are not under version control using the expected protocol if you are sure you want to switch protocols and you don t have any work you need to save from this directory then run rm rf before re running the checkout externals tool here is the status of cdeps components cdeps git status head detached at changes not staged for commit use git add to update what will be committed use git checkout to discard changes in working directory modified fox new commits untracked files use git add to include in what will be committed share no changes added to commit use git add and or git commit a i step into fox and here is the status cdeps fox git status head detached at nothing to commit working tree clean | 1 |
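The "not in a clean state" check in the record above boils down to inspecting each external's git working tree, with untracked files explicitly tolerated. A minimal sketch of that rule (the helper name is hypothetical, not part of `manage_externals`; it assumes `git status --porcelain` output conventions):

```python
def is_clean(porcelain_output: str) -> bool:
    """Return True when `git status --porcelain` reports no tracked changes.

    Lines starting with '??' are untracked files, which checkout_externals
    says are okay to have; anything else (e.g. ' M fox') marks the tree dirty.
    """
    for line in porcelain_output.splitlines():
        if line and not line.startswith("??"):
            return False
    return True

# The cdeps status shown above would be reported dirty: `fox` has new commits,
# while the untracked share/genf90/ directory alone would be fine.
print(is_clean(" M fox\n?? share/genf90/\n"))  # False
print(is_clean("?? share/genf90/\n"))          # True
```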
175,335 | 6,549,787,806 | IssuesEvent | 2017-09-05 08:29:32 | rogerthat-platform/rogerthat-backend | https://api.github.com/repos/rogerthat-platform/rogerthat-backend | opened | Sending message to deactivated user caused Exception instead of CanOnlySendToFriendsException | priority_minor type_bug | ```
Unknown exception occurred: error id bc0bc0ac-f532-4144-8011-d9a3e873fc6d (/base/data/home/apps/e~rogerthat-server/36.403870270423677127/add_1_monkey_patches.py:111)
Traceback (most recent call last):
File "/base/data/home/apps/e~rogerthat-server/36.403870270423677127/rogerthat/rpc/service.py", line 570, in _execute_service_api_call
r = run(f, [], parse_parameters(f, params))
File "/base/data/home/apps/e~rogerthat-server/36.403870270423677127/mcfw/rpc.py", line 180, in run
result = function(*args, **kwargs)
File "/base/data/home/apps/e~rogerthat-server/36.403870270423677127/rogerthat/rpc/service.py", line 355, in wrapped
result = f(*args, **kwargs)
File "/base/data/home/apps/e~rogerthat-server/36.403870270423677127/mcfw/rpc.py", line 164, in typechecked_return
result = f(*args, **kwargs)
File "/base/data/home/apps/e~rogerthat-server/36.403870270423677127/mcfw/rpc.py", line 142, in typechecked_f
return f(**kwargs)
File "/base/data/home/apps/e~rogerthat-server/36.403870270423677127/rogerthat/service/api/messaging.py", line 76, in send
is_mfr=users.get_current_user().is_mfr, broadcast_guid=broadcast_guid, step_id=step_id)
File "/base/data/home/apps/e~rogerthat-server/36.403870270423677127/mcfw/rpc.py", line 164, in typechecked_return
result = f(*args, **kwargs)
File "/base/data/home/apps/e~rogerthat-server/36.403870270423677127/mcfw/rpc.py", line 142, in typechecked_f
return f(**kwargs)
File "/base/data/home/apps/e~rogerthat-server/36.403870270423677127/rogerthat/bizz/messaging.py", line 848, in sendMessage
_validate_members(sender_user_possibly_with_slash_default, sender_is_service_identity, member_users)
File "/base/data/home/apps/e~rogerthat-server/36.403870270423677127/rogerthat/bizz/messaging.py", line 2901, in _validate_members
a = are_service_identity_users(members)
File "/base/data/home/apps/e~rogerthat-server/36.403870270423677127/mcfw/rpc.py", line 164, in typechecked_return
result = f(*args, **kwargs)
File "/base/data/home/apps/e~rogerthat-server/36.403870270423677127/mcfw/rpc.py", line 142, in typechecked_f
return f(**kwargs)
File "/base/data/home/apps/e~rogerthat-server/36.403870270423677127/rogerthat/dal/profile.py", line 43, in are_service_identity_users
return [isinstance(p, ServiceIdentity) for p in get_profile_infos(existing_users)]
File "/base/data/home/apps/e~rogerthat-server/36.403870270423677127/mcfw/rpc.py", line 164, in typechecked_return
result = f(*args, **kwargs)
File "/base/data/home/apps/e~rogerthat-server/36.403870270423677127/mcfw/rpc.py", line 142, in typechecked_f
return f(**kwargs)
File "/base/data/home/apps/e~rogerthat-server/36.403870270423677127/rogerthat/dal/profile.py", line 224, in get_profile_infos
azzert(None not in r)
File "/base/data/home/apps/e~rogerthat-server/36.403870270423677127/mcfw/properties.py", line 52, in azzert
sorted(sys._getframe(1).f_locals.iteritems()), str(error_message) + '\n Locals: '))
AssertionError:
Locals:
- allow_none_in_results: False
- cache_misses: True
- expected_types: None
- f: <function get_service_identity at 0xfafd8c30>
- i: 0
- profile_info: None
- profile_infos: {users.User(email='***@pandora.be:be-sint-truiden'): None}
- r: [None]
- remaining_profile_infos: [None]
- remaining_profile_infos_to_get: [users.User(email='***@pandora.be:be-sint-truiden')]
- result: <mcfw.consts.MissingClass object at 0xfb98bed0>
- u: greet.willems3@pandora.be:be-sint-truiden
- update_mem_cache: True
- update_request_cache: True
- user: ***@pandora.be:be-sint-truiden
- users_: [users.User(email='***@pandora.be:be-sint-truiden')]
``` | 1.0 | Sending message to deactivated user caused Exception instead of CanOnlySendToFriendsException - ```
Unknown exception occurred: error id bc0bc0ac-f532-4144-8011-d9a3e873fc6d (/base/data/home/apps/e~rogerthat-server/36.403870270423677127/add_1_monkey_patches.py:111)
Traceback (most recent call last):
File "/base/data/home/apps/e~rogerthat-server/36.403870270423677127/rogerthat/rpc/service.py", line 570, in _execute_service_api_call
r = run(f, [], parse_parameters(f, params))
File "/base/data/home/apps/e~rogerthat-server/36.403870270423677127/mcfw/rpc.py", line 180, in run
result = function(*args, **kwargs)
File "/base/data/home/apps/e~rogerthat-server/36.403870270423677127/rogerthat/rpc/service.py", line 355, in wrapped
result = f(*args, **kwargs)
File "/base/data/home/apps/e~rogerthat-server/36.403870270423677127/mcfw/rpc.py", line 164, in typechecked_return
result = f(*args, **kwargs)
File "/base/data/home/apps/e~rogerthat-server/36.403870270423677127/mcfw/rpc.py", line 142, in typechecked_f
return f(**kwargs)
File "/base/data/home/apps/e~rogerthat-server/36.403870270423677127/rogerthat/service/api/messaging.py", line 76, in send
is_mfr=users.get_current_user().is_mfr, broadcast_guid=broadcast_guid, step_id=step_id)
File "/base/data/home/apps/e~rogerthat-server/36.403870270423677127/mcfw/rpc.py", line 164, in typechecked_return
result = f(*args, **kwargs)
File "/base/data/home/apps/e~rogerthat-server/36.403870270423677127/mcfw/rpc.py", line 142, in typechecked_f
return f(**kwargs)
File "/base/data/home/apps/e~rogerthat-server/36.403870270423677127/rogerthat/bizz/messaging.py", line 848, in sendMessage
_validate_members(sender_user_possibly_with_slash_default, sender_is_service_identity, member_users)
File "/base/data/home/apps/e~rogerthat-server/36.403870270423677127/rogerthat/bizz/messaging.py", line 2901, in _validate_members
a = are_service_identity_users(members)
File "/base/data/home/apps/e~rogerthat-server/36.403870270423677127/mcfw/rpc.py", line 164, in typechecked_return
result = f(*args, **kwargs)
File "/base/data/home/apps/e~rogerthat-server/36.403870270423677127/mcfw/rpc.py", line 142, in typechecked_f
return f(**kwargs)
File "/base/data/home/apps/e~rogerthat-server/36.403870270423677127/rogerthat/dal/profile.py", line 43, in are_service_identity_users
return [isinstance(p, ServiceIdentity) for p in get_profile_infos(existing_users)]
File "/base/data/home/apps/e~rogerthat-server/36.403870270423677127/mcfw/rpc.py", line 164, in typechecked_return
result = f(*args, **kwargs)
File "/base/data/home/apps/e~rogerthat-server/36.403870270423677127/mcfw/rpc.py", line 142, in typechecked_f
return f(**kwargs)
File "/base/data/home/apps/e~rogerthat-server/36.403870270423677127/rogerthat/dal/profile.py", line 224, in get_profile_infos
azzert(None not in r)
File "/base/data/home/apps/e~rogerthat-server/36.403870270423677127/mcfw/properties.py", line 52, in azzert
sorted(sys._getframe(1).f_locals.iteritems()), str(error_message) + '\n Locals: '))
AssertionError:
Locals:
- allow_none_in_results: False
- cache_misses: True
- expected_types: None
- f: <function get_service_identity at 0xfafd8c30>
- i: 0
- profile_info: None
- profile_infos: {users.User(email='***@pandora.be:be-sint-truiden'): None}
- r: [None]
- remaining_profile_infos: [None]
- remaining_profile_infos_to_get: [users.User(email='***@pandora.be:be-sint-truiden')]
- result: <mcfw.consts.MissingClass object at 0xfb98bed0>
- u: greet.willems3@pandora.be:be-sint-truiden
- update_mem_cache: True
- update_request_cache: True
- user: ***@pandora.be:be-sint-truiden
- users_: [users.User(email='***@pandora.be:be-sint-truiden')]
``` | priority | sending message to deactivated user caused exception instead of canonlysendtofriendsexception unknown exception occurred error id base data home apps e rogerthat server add monkey patches py traceback most recent call last file base data home apps e rogerthat server rogerthat rpc service py line in execute service api call r run f parse parameters f params file base data home apps e rogerthat server mcfw rpc py line in run result function args kwargs file base data home apps e rogerthat server rogerthat rpc service py line in wrapped result f args kwargs file base data home apps e rogerthat server mcfw rpc py line in typechecked return result f args kwargs file base data home apps e rogerthat server mcfw rpc py line in typechecked f return f kwargs file base data home apps e rogerthat server rogerthat service api messaging py line in send is mfr users get current user is mfr broadcast guid broadcast guid step id step id file base data home apps e rogerthat server mcfw rpc py line in typechecked return result f args kwargs file base data home apps e rogerthat server mcfw rpc py line in typechecked f return f kwargs file base data home apps e rogerthat server rogerthat bizz messaging py line in sendmessage validate members sender user possibly with slash default sender is service identity member users file base data home apps e rogerthat server rogerthat bizz messaging py line in validate members a are service identity users members file base data home apps e rogerthat server mcfw rpc py line in typechecked return result f args kwargs file base data home apps e rogerthat server mcfw rpc py line in typechecked f return f kwargs file base data home apps e rogerthat server rogerthat dal profile py line in are service identity users return file base data home apps e rogerthat server mcfw rpc py line in typechecked return result f args kwargs file base data home apps e rogerthat server mcfw rpc py line in typechecked f return f kwargs file base data home 
apps e rogerthat server rogerthat dal profile py line in get profile infos azzert none not in r file base data home apps e rogerthat server mcfw properties py line in azzert sorted sys getframe f locals iteritems str error message n locals assertionerror locals allow none in results false cache misses true expected types none f i profile info none profile infos users user email pandora be be sint truiden none r remaining profile infos remaining profile infos to get result u greet pandora be be sint truiden update mem cache true update request cache true user pandora be be sint truiden users | 1 |
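The `azzert(None not in r)` call in the traceback above raises an `AssertionError` whose message embeds the caller's local variables (the "Locals:" dump). A rough, hypothetical reconstruction of that pattern — not the actual `mcfw.properties` implementation:

```python
import sys

def azzert(condition, message=""):
    """Assert `condition`; on failure, include the caller's locals in the error."""
    if not condition:
        caller_locals = sorted(sys._getframe(1).f_locals.items())
        detail = "\n".join(" - %s: %r" % (name, value) for name, value in caller_locals)
        raise AssertionError("%s\n Locals:\n%s" % (message, detail))

def lookup():
    r = [None]  # e.g. a profile lookup that found no matching identity
    azzert(None not in r, "profile missing")

try:
    lookup()
except AssertionError as exc:
    print("Locals" in str(exc))  # the failing frame's variables are in the message
```

This explains why the error report lists variables like `profile_infos` and `remaining_profile_infos_to_get`: they are simply the locals of the frame where the assertion failed.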
44,216 | 2,900,282,469 | IssuesEvent | 2015-06-17 15:41:30 | rasmi/civic-graph | https://api.github.com/repos/rasmi/civic-graph | closed | Only show three items by default in overflowing lists. | enhancement priority-medium | See `$scope.itemsShownDefault` in `app/static/js/app.js` to set defaults. | 1.0 | Only show three items by default in overflowing lists. - See `$scope.itemsShownDefault` in `app/static/js/app.js` to set defaults. | priority | only show three items by default in overflowing lists see scope itemsshowndefault in app static js app js to set defaults | 1 |
133,090 | 5,196,815,538 | IssuesEvent | 2017-01-23 14:02:10 | fgpv-vpgf/fgpv-vpgf | https://api.github.com/repos/fgpv-vpgf/fgpv-vpgf | closed | "Help" non responsive in Firefox | browser: Foxes bug-type: broken use case priority: high problem: bug v1.5.0 | Tested URL :http://fgpv.cloudapp.net/demo/v1.5.0-1/prod/samples/index-fgp-en.html?keys=JOSM,EcoAction,CESI_Other,NPRI_CO,Railways,mb_colour,Airports,Barley
"Help" option from the left hand side panel or from the right side panel do not work. Help window does not load in both cases. Below is the error I see in the console. Please investigate

| 1.0 | "Help" non responsive in Firefox - Tested URL :http://fgpv.cloudapp.net/demo/v1.5.0-1/prod/samples/index-fgp-en.html?keys=JOSM,EcoAction,CESI_Other,NPRI_CO,Railways,mb_colour,Airports,Barley
"Help" option from the left hand side panel or from the right side panel do not work. Help window does not load in both cases. Below is the error I see in the console. Please investigate

| priority | help non responsive in firefox tested url help option from the left hand side panel or from the right side panel do not work help window does not load in both cases below is the error i see in the console please investigate | 1 |
827,175 | 31,758,274,094 | IssuesEvent | 2023-09-12 01:38:00 | googleapis/python-dns | https://api.github.com/repos/googleapis/python-dns | opened | tests.unit.test__http.TestConnection: test_build_api_url_w_extra_query_params failed | type: bug priority: p1 flakybot: issue | Note: #85 was also for this test, but it was closed more than 10 days ago. So, I didn't mark it flaky.
----
commit: 7bf0386cffba77c0a0b14865d28944f476e9c5e2
buildURL: [Build Status](https://source.cloud.google.com/results/invocations/00af0d43-6328-4db5-8d99-3d1dbed23cb2), [Sponge](http://sponge2/00af0d43-6328-4db5-8d99-3d1dbed23cb2)
status: failed
<details><summary>Test output</summary><br><pre>self = <tests.unit.test__http.TestConnection testMethod=test_build_api_url_w_extra_query_params>
def test_build_api_url_w_extra_query_params(self):
from urllib.parse import parse_qsl
from urllib.parse import urlsplit
> conn = self._make_one(object())
tests/unit/test__http.py:63:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/unit/test__http.py:28: in _make_one
return self._get_target_class()(*args, **kw)
tests/unit/test__http.py:23: in _get_target_class
from google.cloud.dns._http import Connection
google/cloud/dns/__init__.py:32: in <module>
from google.cloud.dns.zone import Changes
google/cloud/dns/zone.py:18: in <module>
from google.cloud._helpers import _rfc3339_to_datetime
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
"""
from __future__ import absolute_import
import calendar
import datetime
import os
import re
from threading import local as Local
> import six
E ModuleNotFoundError: No module named 'six'
.nox/unit-3-7/lib/python3.7/site-packages/google/cloud/_helpers.py:28: ModuleNotFoundError</pre></details> | 1.0 | tests.unit.test__http.TestConnection: test_build_api_url_w_extra_query_params failed - Note: #85 was also for this test, but it was closed more than 10 days ago. So, I didn't mark it flaky.
----
commit: 7bf0386cffba77c0a0b14865d28944f476e9c5e2
buildURL: [Build Status](https://source.cloud.google.com/results/invocations/00af0d43-6328-4db5-8d99-3d1dbed23cb2), [Sponge](http://sponge2/00af0d43-6328-4db5-8d99-3d1dbed23cb2)
status: failed
<details><summary>Test output</summary><br><pre>self = <tests.unit.test__http.TestConnection testMethod=test_build_api_url_w_extra_query_params>
def test_build_api_url_w_extra_query_params(self):
from urllib.parse import parse_qsl
from urllib.parse import urlsplit
> conn = self._make_one(object())
tests/unit/test__http.py:63:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/unit/test__http.py:28: in _make_one
return self._get_target_class()(*args, **kw)
tests/unit/test__http.py:23: in _get_target_class
from google.cloud.dns._http import Connection
google/cloud/dns/__init__.py:32: in <module>
from google.cloud.dns.zone import Changes
google/cloud/dns/zone.py:18: in <module>
from google.cloud._helpers import _rfc3339_to_datetime
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
"""
from __future__ import absolute_import
import calendar
import datetime
import os
import re
from threading import local as Local
> import six
E ModuleNotFoundError: No module named 'six'
.nox/unit-3-7/lib/python3.7/site-packages/google/cloud/_helpers.py:28: ModuleNotFoundError</pre></details> | priority | tests unit test http testconnection test build api url w extra query params failed note was also for this test but it was closed more than days ago so i didn t mark it flaky commit buildurl status failed test output self def test build api url w extra query params self from urllib parse import parse qsl from urllib parse import urlsplit conn self make one object tests unit test http py tests unit test http py in make one return self get target class args kw tests unit test http py in get target class from google cloud dns http import connection google cloud dns init py in from google cloud dns zone import changes google cloud dns zone py in from google cloud helpers import to datetime from future import absolute import import calendar import datetime import os import re from threading import local as local import six e modulenotfounderror no module named six nox unit lib site packages google cloud helpers py modulenotfounderror | 1 |
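The failure in the record above is just a missing `six` dependency at import time (`google.cloud._helpers` did `import six` unconditionally). Where code wants to keep working whether or not `six` is installed, a common guard looks like the following — a hedged sketch, not the actual google-cloud fix:

```python
try:
    import six  # optional compatibility shim; may be absent in the environment
except ImportError:
    six = None

def string_types():
    """Return the tuple of string types, with a Python 3-only fallback."""
    return (str,) if six is None else six.string_types

# Works the same way whether `six` is importable or not on Python 3.
print(str in string_types())  # True
```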
52,285 | 10,817,491,366 | IssuesEvent | 2019-11-08 09:51:42 | MoonchildProductions/UXP | https://api.github.com/repos/MoonchildProductions/UXP | closed | Remove DiskSpaceWatcher | App: Toolkit Code Cleanup Fixed Low Risk | This component was only ever relevant for space-restricted devices (i.e. GONK/Firefox OS) and has never been enabled on any other target. This is unmaintained code with an unknown working state.
It lives in `/toolkit/components/diskspacewatcher` and has some call sites from DOM (for local storage restrictions if the disk is full, etc.). | 1.0 | Remove DiskSpaceWatcher - This component was only ever relevant for space-restricted devices (i.e. GONK/Firefox OS) and has never been enabled on any other target. This is unmaintained code with an unknown working state.
It lives in `/toolkit/components/diskspacewatcher` and has some call sites from DOM (for local storage restrictions if the disk is full, etc.). | non_priority | remove diskspacewatcher this component was only ever relevant for space restricted devices i e gonk firefox os and has never been enabled on any other target this is unmaintained code with an unknown working state it lives in toolkit components diskspacewatcher and has some call sites from dom for local storage restrictions if the disk is full etc | 0 |
62,780 | 12,240,706,310 | IssuesEvent | 2020-05-05 01:16:49 | microsoft/AdaptiveCards | https://api.github.com/repos/microsoft/AdaptiveCards | closed | [UWP][Accessibility] Focusable sibling elements must not have same and localized control type. | Bug Status-In Code Review Triage-Approved for Fix | # Platform
* UWP
# Version of SDK
master
# Details
Repro Steps:
1. Launch the application.
2. New screen starts appearing. Navigate to Left pane JSON files.
3. Now select "ExpenseReport.JSON" button. It will update the Middle and Right Pane.
4. Navigate to amount section like (Air Travel Expense $300, Auto Mobile Expense $100) etc. at right pane.
5. Verify name of each amount.
Actual Result:
There is no unique name defined for amount section like(Air Travel Expense $300, Auto Mobile Expense $100) etc. at right pane of Left Pane JSON Files and due to this, screen reader only announce Expand/collapse button which creates confusion for the screen reader user.
Expected Result:
There should be unique names for sibling controls(Air Travel Expense $300, Auto Mobile Expense $100) etc. that have the same Control type property so that screen reader user could easily identify unique control. | 1.0 | [UWP][Accessibility] Focusable sibling elements must not have same and localized control type. - # Platform
* UWP
# Version of SDK
master
# Details
Repro Steps:
1. Launch the application.
2. New screen starts appearing. Navigate to Left pane JSON files.
3. Now select "ExpenseReport.JSON" button. It will update the Middle and Right Pane.
4. Navigate to amount section like (Air Travel Expense $300, Auto Mobile Expense $100) etc. at right pane.
5. Verify name of each amount.
Actual Result:
There is no unique name defined for amount section like(Air Travel Expense $300, Auto Mobile Expense $100) etc. at right pane of Left Pane JSON Files and due to this, screen reader only announce Expand/collapse button which creates confusion for the screen reader user.
Expected Result:
There should be unique names for sibling controls(Air Travel Expense $300, Auto Mobile Expense $100) etc. that have the same Control type property so that screen reader user could easily identify unique control. | non_priority | focusable sibling elements must not have same and localized control type platform uwp version of sdk master details repro steps launch the application new screen starts appearing navigate to left pane json files now select expensereport json button it will update the middle and right pane navigate to amount section like air travel expense auto mobile expense etc at right pane verify name of each amount actual result there is no unique name defined for amount section like air travel expense auto mobile expense etc at right pane of left pane json files and due to this screen reader only announce expand collapse button which creates confusion for the screen reader user expected result there should be unique names for sibling controls air travel expense auto mobile expense etc that have the same control type property so that screen reader user could easily identify unique control | 0 |
207,178 | 7,125,398,021 | IssuesEvent | 2018-01-19 22:50:56 | sul-dlss/preservation_catalog | https://api.github.com/repos/sul-dlss/preservation_catalog | closed | (C2M) SQL query for looping through catalog entries | high priority in progress | (See PR #459 if it hasn't been merged yet)
We will be selecting PreservedCopy objects by storage_dir / endpoint and by last_version_audit date. We want a SQL query to get us the correct PreservedCopy objects, but it must be done in a scalable way (i.e. chunked).
@jmartin-sul says: "re: the C2M query, i think you could use plain `find` if you just wanted a list of IDs to loop over, but passing a block to `find_each` was the more efficient approach, IIRC. because then you don't have to go back and do a `find` on each object ID (i assume that `find_each` does some paging under the hood, and that it'll make multiple DB connections, but fewer connections than `find` to get the list of IDs and `find` on each returned ID). one more random suggestion: if you only want the last N records, i'd have the query do the sort/limit (as opposed to querying broadly and filtering the list in ruby-land). presumably we'll be querying on indexed cols, which should be pretty efficient." | 1.0 | (C2M) SQL query for looping through catalog entries - (See PR #459 if it hasn't been merged yet)
We will be selecting PreservedCopy objects by storage_dir / endpoint and by last_version_audit date. We want a SQL query to get us the correct PreservedCopy objects, but it must be done in a scalable way (i.e. chunked).
@jmartin-sul says: "re: the C2M query, i think you could use plain `find` if you just wanted a list of IDs to loop over, but passing a block to `find_each` was the more efficient approach, IIRC. because then you don't have to go back and do a `find` on each object ID (i assume that `find_each` does some paging under the hood, and that it'll make multiple DB connections, but fewer connections than `find` to get the list of IDs and `find` on each returned ID). one more random suggestion: if you only want the last N records, i'd have the query do the sort/limit (as opposed to querying broadly and filtering the list in ruby-land). presumably we'll be querying on indexed cols, which should be pretty efficient." | priority | sql query for looping through catalog entries see pr if it hasn t been merged yet we will be selecting preservedcopy objects by storage dir endpoint and by last version audit date we want a sql query to get us the correct preservedcopy objects but it must be done in a scalable way i e chunked jmartin sul says re the query i think you could use plain find if you just wanted a list of ids to loop over but passing a block to find each was the more efficient approach iirc because then you don t have to go back and do a find on each object id i assume that find each does some paging under the hood and that it ll make multiple db connections but fewer connections than find to get the list of ids and find on each returned id one more random suggestion if you only want the last n records i d have the query do the sort limit as opposed to querying broadly and filtering the list in ruby land presumably we ll be querying on indexed cols which should be pretty efficient | 1 |
70,717 | 13,527,485,773 | IssuesEvent | 2020-09-15 15:28:28 | Perl/perl5 | https://api.github.com/repos/Perl/perl5 | closed | study_chunk recursion | Needs Triage meta-regexp-code | This is a placeholder ticket for consideration of a theoretically possible bug.
In #16947 we found that study_chunk reinvokes itself in two ways - by simple recursion, and by enframing. In some cases that involves restudying regexp ops multiple times, whereas in other cases the reinvocation is the only time the relevant ops are studied. The primary results of studying are a) to capture global information about the regexp that will be used for optimization at runtime; b) to make in-place modifications to the ops for optimization (optional but desirable); and c) to make mandatory modifications to the ops, replacing temporary compile-time-only ops that the runtime engine does not know how to handle.
Because of (c) it is required that every op is studied at least once.
When ops are studied multiple times that can cause problems: the first invocation may capture information about the program, then reinvoke, then attempt to use the captured information assuming it has not changed.
The conclusion is that mutation of ops must happen only once, at the outermost level of reinvocation that will act on the relevant ops.
As far as I was able to discover the only case in which ops are studied multiple times is in the handling of GOSUB, which reinvokes by enframing. In #16947 this was resolved by recording in each frame whether it, or any outer frame, represented the handling of a GOSUB, and suppressing all mutating changes if so (confident that the same ops will be studied at some point in some outer frame that is not within the handling of a GOSUB).
When we reinvoke by recursion, however, any frames used by the caller are not visible to the callee; as such it may still be possible to trigger the same types of problem if reinvocation involves a mix of enframing and recursion.
Extending the fix from 089ad25d3f to handle this case would involve adding an extra boolean argument `was_mutate_ok` to study_chunk. All principal calls would pass this in as 0; all recursive calls would pass in the local value of `mutate_ok`; and the setting of `mutate_ok` would change to:
```
bool mutate_ok = (was_mutate_ok && (!frame || !frame->in_gosub));
```
I don't intend to make such a change unless we find a testcase to show this is a real rather than a theoretical problem. | 1.0 | study_chunk recursion - This is a placeholder ticket for consideration of a theoretically possible bug.
In #16947 we found that study_chunk reinvokes itself in two ways - by simple recursion, and by enframing. In some cases that involves restudying regexp ops multiple times, whereas in other cases the reinvocation is the only time the relevant ops are studied. The primary results of studying are a) to capture global information about the regexp that will be used for optimization at runtime; b) to make in-place modifications to the ops for optimization (optional but desirable); and c) to make mandatory modifications to the ops, replacing temporary compile-time-only ops that the runtime engine does not know how to handle.
Because of (c) it is required that every op is studied at least once.
When ops are studied multiple times that can cause problems: the first invocation may capture information about the program, then reinvoke, then attempt to use the captured information assuming it has not changed.
The conclusion is that mutation of ops must happen only once, at the outermost level of reinvocation that will act on the relevant ops.
As far as I was able to discover the only case in which ops are studied multiple times is in the handling of GOSUB, which reinvokes by enframing. In #16947 this was resolved by recording in each frame whether it, or any outer frame, represented the handling of a GOSUB, and suppressing all mutating changes if so (confident that the same ops will be studied at some point in some outer frame that is not within the handling of a GOSUB).
When we reinvoke by recursion, however, any frames used by the caller are not visible to the callee; as such it may still be possible to trigger the same types of problem if reinvocation involves a mix of enframing and recursion.
Extending the fix from 089ad25d3f to handle this case would involve adding an extra boolean argument `was_mutate_ok` to study_chunk. All principal calls would pass this in as 0; all recursive calls would pass in the local value of `mutate_ok`; and the setting of `mutate_ok` would change to:
```
bool mutate_ok = (was_mutate_ok && (!frame || !frame->in_gosub));
```
I don't intend to make such a change unless we find a testcase to show this is a real rather than a theoretical problem. | non_priority | study chunk recursion this is a placeholder ticket for consideration of a theoretically possible bug in we found that study chunk reinvokes itself in two ways by simple recursion and by enframing in some cases that involves restudying regexp ops multiple times whereas in other cases the reinvocation is the only time the relevant ops are studied the primary results of studying are a to capture global information about the regexp that will be used for optimization at runtime b to make in place modifications to the ops for optimization optional but desirable and c to make mandatory modifications to the ops replacing temporary compile time only ops that the runtime engine does not know how to handle because of c it is required that every op is studied at least once when ops are studied multiple times that can cause problems the first invocation may capture information about the program then reinvoke then attempt to use the captured information assuming it has not changed the conclusion is that mutation of ops must happen only once at the outermost level of reinvocation that will act on the relevant ops as far as i was able to discover the only case in which ops are studied multiple times is in the handling of gosub which reinvokes by enframing in this was resolved by recording in each frame whether it or any outer frame represented the handling of a gosub and suppressing all mutating changes if so confident that the same ops will be studied at some point in some outer frame that is not within the handling of a gosub when we reinvoke by recursion however any frames used by the caller are not visible to the callee as such it may still be possible to trigger the same types of problem if reinvocation involves a mix of enframing and recursion extending the fix from to handle this case would involve adding an extra boolean argument was mutate ok to study chunk all principal calls would pass this in as all recursive calls would pass in the local value of mutate ok and the setting of mutate ok would change to bool mutate ok was mutate ok frame frame in gosub i don t intend to make such a change unless we find a testcase to show this is a real rather than a theoretical problem | 0 |
375,669 | 11,115,104,711 | IssuesEvent | 2019-12-18 10:03:47 | ooni/probe-engine | https://api.github.com/repos/ooni/probe-engine | closed | QA for Telegram in Go | cycle backlog effort/M priority/high technical task | This issue is about doing a final round of QA for Telegram in Go and replace the C++ implementation in MK with the implementation in Go, if we're satisfied. | 1.0 | QA for Telegram in Go - This issue is about doing a final round of QA for Telegram in Go and replace the C++ implementation in MK with the implementation in Go, if we're satisfied. | priority | qa for telegram in go this issue is about doing a final round of qa for telegram in go and replace the c implementation in mk with the implementation in go if we re satisfied | 1 |
659,251 | 21,920,721,036 | IssuesEvent | 2022-05-22 14:27:53 | stax76/mpv.net | https://api.github.com/repos/stax76/mpv.net | closed | The autoload function is not working normally | feature request priority medium | **Describe the bug**
When files are recovered not through the menu but through other scripts, other files in the same directory cannot be loaded automatically. For example, this restriction will be triggered when using the script function of [SmartHistory.lua](https://github.com/Eisa01/mpv-scripts#smarthistory-script) or [SmartCopyPaste-II.lua](https://github.com/Eisa01/mpv-scripts#smartcopypaste-ii-script). but the [autoload.lua](https://github.com/mpv-player/mpv/blob/master/TOOLS/lua/autoload.lua) works normally
mpv.net version: 5.5.0.3 Beta
**To Reproduce**
Steps to reproduce the behavior:
1. Use SmartHistory.lua or SmartCopyPaste-II.lua
2. Use the above script functions
3. Other files in the same directory cannot be loaded automatically
**Expected behavior**
Automatic loading of files in the same directory
**Screenshots**

**Additional context**
When the autoload.lua script is used to load the file, everything works normally | 1.0 | The autoload function is not working normally - **Describe the bug**
When files are recovered not through the menu but through other scripts, other files in the same directory cannot be loaded automatically. For example, this restriction will be triggered when using the script function of [SmartHistory.lua](https://github.com/Eisa01/mpv-scripts#smarthistory-script) or [SmartCopyPaste-II.lua](https://github.com/Eisa01/mpv-scripts#smartcopypaste-ii-script). but the [autoload.lua](https://github.com/mpv-player/mpv/blob/master/TOOLS/lua/autoload.lua) works normally
mpv.net version: 5.5.0.3 Beta
**To Reproduce**
Steps to reproduce the behavior:
1. Use SmartHistory.lua or SmartCopyPaste-II.lua
2. Use the above script functions
3. Other files in the same directory cannot be loaded automatically
**Expected behavior**
Automatic loading of files in the same directory
**Screenshots**

**Additional context**
When the autoload.lua script is used to load the file, everything works normally | priority | the autoload function is not working normally describe the bug when files are recovered not through the menu but through other scripts other files in the same directory cannot be loaded automatically for example this restriction will be triggered when using the script function of or but the works normally mpv net version beta to reproduce steps to reproduce the behavior use smarthistory lua or smartcopypaste ii lua use the above script functions other files in the same directory cannot be loaded automatically expected behavior automatic loading of files in the same directory screenshots additional context when the autoload lua script is used to load the file everything works normally | 1 |
139,434 | 12,856,996,219 | IssuesEvent | 2020-07-09 08:35:19 | xarial/docify | https://api.github.com/repos/xarial/docify | opened | Add merge option to code snippet plugin | documentation library | This option should merge the snippets created as the result of excluding regions or multiple regions into a single snippet without jagged lines | 1.0 | Add merge option to code snippet plugin - This option should merge the snippets created as the result of excluding regions or multiple regions into a single snippet without jagged lines | non_priority | add merge option to code snippet plugin this option should merge the snippets created as the result of excluding regions or multiple regions into a single snippet without jagged lines | 0 |
20,296 | 26,933,368,383 | IssuesEvent | 2023-02-07 18:30:50 | scverse/anndata | https://api.github.com/repos/scverse/anndata | opened | Remove workaround from test_concat_size_0_dim once upstream bug fixed | upstream topic: combining Bug 🐛 dev process | * https://github.com/scikit-hep/awkward/issues/2209
Remove marked workaround from `test_concat_size_0_dim` once the upstream bug from awkward is fixed, and a release made. | 1.0 | Remove workaround from test_concat_size_0_dim once upstream bug fixed - * https://github.com/scikit-hep/awkward/issues/2209
Remove marked workaround from `test_concat_size_0_dim` once the upstream bug from awkward is fixed, and a release made. | non_priority | remove workaround from test concat size dim once upstream bug fixed remove marked workaround from test concat size dim once the upstream bug from awkward is fixed and a release made | 0 |
275,204 | 20,912,436,470 | IssuesEvent | 2022-03-24 10:31:39 | JoGorska/bonsai-shop | https://api.github.com/repos/JoGorska/bonsai-shop | opened | DOCS: html validation | documentation | - validate all pages
- download prof of validation
- update readme with validation result
| 1.0 | DOCS: html validation - - validate all pages
- download prof of validation
- update readme with validation result
| non_priority | docs html validation validate all pages download prof of validation update readme with validation result | 0 |
376,635 | 11,149,594,234 | IssuesEvent | 2019-12-23 19:16:32 | texas-justice-initiative/website-nextjs | https://api.github.com/repos/texas-justice-initiative/website-nextjs | opened | make change to Eva Ruth's bio | high priority | Please make the following change to Eva Ruth's bio on "About Us" (no longer writing the book, thankfully!!)
> Executive Director and co-founder Eva Ruth Moravec is a 2018 John Jay/Harry Frank Guggenheim Criminal Justice Reporting fellow, **and a** freelance reporter `<del>`and the author of a forthcoming book that explores the legality of police shootings `</del>` **covering criminal justice** in Texas **and throughout the U.S.**. | 1.0 | make change to Eva Ruth's bio - Please make the following change to Eva Ruth's bio on "About Us" (no longer writing the book, thankfully!!)
> Executive Director and co-founder Eva Ruth Moravec is a 2018 John Jay/Harry Frank Guggenheim Criminal Justice Reporting fellow, **and a** freelance reporter `<del>`and the author of a forthcoming book that explores the legality of police shootings `</del>` **covering criminal justice** in Texas **and throughout the U.S.**. | priority | make change to eva ruth s bio please make the following change to eva ruth s bio on about us no longer writing the book thankfully executive director and co founder eva ruth moravec is a john jay harry frank guggenheim criminal justice reporting fellow and a freelance reporter and the author of a forthcoming book that explores the legality of police shootings covering criminal justice in texas and throughout the u s | 1 |
832,392 | 32,078,554,066 | IssuesEvent | 2023-09-25 12:37:33 | AY2324S1-CS2103T-T10-2/tp | https://api.github.com/repos/AY2324S1-CS2103T-T10-2/tp | opened | As an Event Planner, I can search for my contacts | type.Story priority.High | ... so that I can quickly locate the details of contact I want.
Tasks:
- [ ] Implement searching of contacts `findPerson` @jiakai-17
- [ ] Add `findPerson` to User Guide @jiakai-17 | 1.0 | As an Event Planner, I can search for my contacts - ... so that I can quickly locate the details of contact I want.
Tasks:
- [ ] Implement searching of contacts `findPerson` @jiakai-17
- [ ] Add `findPerson` to User Guide @jiakai-17 | priority | as an event planner i can search for my contacts so that i can quickly locate the details of contact i want tasks implement searching of contacts findperson jiakai add findperson to user guide jiakai | 1 |
203,018 | 15,339,770,912 | IssuesEvent | 2021-02-27 03:29:30 | rancher/dashboard | https://api.github.com/repos/rancher/dashboard | closed | Cronjob is not counting successful amount in details | [zube]: To Test area/workloads kind/bug | Steps:
1. Create a cronjob that will fire every few minutes.
2. After about 10 minutes go to the details page
Results: The job fires successfully, but the count still says 0

| 1.0 | Cronjob is not counting successful amount in details - Steps:
1. Create a cronjob that will fire every few minutes.
2. After about 10 minutes go to the details page
Results: The job fires successfully, but the count still says 0

| non_priority | cronjob is not counting successful amount in details steps create a cronjob that will fire every few minutes after about minutes go to the details page results the job fires successfully but the count still says | 0 |
711,294 | 24,457,171,887 | IssuesEvent | 2022-10-07 07:53:48 | insightsengineering/tern.mmrm | https://api.github.com/repos/insightsengineering/tern.mmrm | closed | Add `weights` option to `fit_mmrm` etc. | SP2 high priority | Idea: Expose the `weights` option from `mmrm`.
To do:
- [x] `h_assert_data()`
- [x] `h_labels()`
- [x] `fit_mmrm()`
- [x] `g_mmrm_diagnostic()1
- [x] document
- [x] Add tests | 1.0 | Add `weights` option to `fit_mmrm` etc. - Idea: Expose the `weights` option from `mmrm`.
To do:
- [x] `h_assert_data()`
- [x] `h_labels()`
- [x] `fit_mmrm()`
- [x] `g_mmrm_diagnostic()1
- [x] document
- [x] Add tests | priority | add weights option to fit mmrm etc idea expose the weights option from mmrm to do h assert data h labels fit mmrm g mmrm diagnostic document add tests | 1 |
568,323 | 16,964,751,154 | IssuesEvent | 2021-06-29 09:30:52 | webcompat/web-bugs | https://api.github.com/repos/webcompat/web-bugs | closed | mail.yahoo.com - design is broken | browser-firefox engine-gecko os-mac priority-critical | <!-- @browser: Firefox 90.0 -->
<!-- @ua_header: Mozilla/5.0 (Macintosh; Intel Mac OS X 10.15; rv:90.0) Gecko/20100101 Firefox/90.0 -->
<!-- @reported_with: desktop-reporter -->
<!-- @public_url: https://github.com/webcompat/web-bugs/issues/78239 -->
**URL**: https://mail.yahoo.com/d/folders/1/messages/38474
**Browser / Version**: Firefox 90.0
**Operating System**: Mac OS X 10.15
**Tested Another Browser**: Yes Safari
**Problem type**: Design is broken
**Description**: Items not fully visible
**Steps to Reproduce**:
Yahoo mail preview screen will not load the entire document to print from Yahoo mail.
<details>
<summary>Browser Configuration</summary>
<ul>
<li>gfx.webrender.all: false</li><li>gfx.webrender.blob-images: true</li><li>gfx.webrender.enabled: false</li><li>image.mem.shared: true</li><li>buildID: 20210622185930</li><li>channel: beta</li><li>hasTouchScreen: false</li><li>mixed active content blocked: false</li><li>mixed passive content blocked: false</li><li>tracking content blocked: false</li>
</ul>
</details>
[View console log messages](https://webcompat.com/console_logs/2021/6/bf670854-cb7f-486d-a548-8a028d37fbe3)
_From [webcompat.com](https://webcompat.com/) with ❤️_ | 1.0 | mail.yahoo.com - design is broken - <!-- @browser: Firefox 90.0 -->
<!-- @ua_header: Mozilla/5.0 (Macintosh; Intel Mac OS X 10.15; rv:90.0) Gecko/20100101 Firefox/90.0 -->
<!-- @reported_with: desktop-reporter -->
<!-- @public_url: https://github.com/webcompat/web-bugs/issues/78239 -->
**URL**: https://mail.yahoo.com/d/folders/1/messages/38474
**Browser / Version**: Firefox 90.0
**Operating System**: Mac OS X 10.15
**Tested Another Browser**: Yes Safari
**Problem type**: Design is broken
**Description**: Items not fully visible
**Steps to Reproduce**:
Yahoo mail preview screen will not load the entire document to print from Yahoo mail.
<details>
<summary>Browser Configuration</summary>
<ul>
<li>gfx.webrender.all: false</li><li>gfx.webrender.blob-images: true</li><li>gfx.webrender.enabled: false</li><li>image.mem.shared: true</li><li>buildID: 20210622185930</li><li>channel: beta</li><li>hasTouchScreen: false</li><li>mixed active content blocked: false</li><li>mixed passive content blocked: false</li><li>tracking content blocked: false</li>
</ul>
</details>
[View console log messages](https://webcompat.com/console_logs/2021/6/bf670854-cb7f-486d-a548-8a028d37fbe3)
_From [webcompat.com](https://webcompat.com/) with ❤️_ | priority | mail yahoo com design is broken url browser version firefox operating system mac os x tested another browser yes safari problem type design is broken description items not fully visible steps to reproduce yahoo mail preview screen will not load the entire document to print from yahoo mail browser configuration gfx webrender all false gfx webrender blob images true gfx webrender enabled false image mem shared true buildid channel beta hastouchscreen false mixed active content blocked false mixed passive content blocked false tracking content blocked false from with ❤️ | 1 |
143,051 | 19,142,713,398 | IssuesEvent | 2021-12-02 01:56:51 | azmathasan92/concourse-ci-cd | https://api.github.com/repos/azmathasan92/concourse-ci-cd | opened | CVE-2021-22096 (Medium) detected in spring-webflux-5.0.8.RELEASE.jar, spring-web-5.0.8.RELEASE.jar | security vulnerability | ## CVE-2021-22096 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Libraries - <b>spring-webflux-5.0.8.RELEASE.jar</b>, <b>spring-web-5.0.8.RELEASE.jar</b></p></summary>
<p>
<details><summary><b>spring-webflux-5.0.8.RELEASE.jar</b></p></summary>
<p>Spring WebFlux</p>
<p>Library home page: <a href="https://github.com/spring-projects/spring-framework">https://github.com/spring-projects/spring-framework</a></p>
<p>Path to dependency file: concourse-ci-cd/pom.xml</p>
<p>Path to vulnerable library: /home/wss-scanner/.m2/repository/org/springframework/spring-webflux/5.0.8.RELEASE/spring-webflux-5.0.8.RELEASE.jar</p>
<p>
Dependency Hierarchy:
- spring-boot-starter-webflux-2.0.4.RELEASE.jar (Root Library)
- :x: **spring-webflux-5.0.8.RELEASE.jar** (Vulnerable Library)
</details>
<details><summary><b>spring-web-5.0.8.RELEASE.jar</b></p></summary>
<p>Spring Web</p>
<p>Library home page: <a href="https://github.com/spring-projects/spring-framework">https://github.com/spring-projects/spring-framework</a></p>
<p>Path to dependency file: concourse-ci-cd/pom.xml</p>
<p>Path to vulnerable library: /home/wss-scanner/.m2/repository/org/springframework/spring-web/5.0.8.RELEASE/spring-web-5.0.8.RELEASE.jar</p>
<p>
Dependency Hierarchy:
- spring-boot-starter-webflux-2.0.4.RELEASE.jar (Root Library)
- :x: **spring-web-5.0.8.RELEASE.jar** (Vulnerable Library)
</details>
<p>Found in base branch: <b>master</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
In Spring Framework versions 5.3.0 - 5.3.10, 5.2.0 - 5.2.17, and older unsupported versions, it is possible for a user to provide malicious input to cause the insertion of additional log entries.
<p>Publish Date: 2021-10-28
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-22096>CVE-2021-22096</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>4.3</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: Low
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: Low
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://tanzu.vmware.com/security/cve-2021-22096">https://tanzu.vmware.com/security/cve-2021-22096</a></p>
<p>Release Date: 2021-10-28</p>
<p>Fix Resolution: org.springframework:spring:5.2.18.RELEASE,5.3.12</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github) | True | CVE-2021-22096 (Medium) detected in spring-webflux-5.0.8.RELEASE.jar, spring-web-5.0.8.RELEASE.jar - ## CVE-2021-22096 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Libraries - <b>spring-webflux-5.0.8.RELEASE.jar</b>, <b>spring-web-5.0.8.RELEASE.jar</b></p></summary>
<p>
<details><summary><b>spring-webflux-5.0.8.RELEASE.jar</b></p></summary>
<p>Spring WebFlux</p>
<p>Library home page: <a href="https://github.com/spring-projects/spring-framework">https://github.com/spring-projects/spring-framework</a></p>
<p>Path to dependency file: concourse-ci-cd/pom.xml</p>
<p>Path to vulnerable library: /home/wss-scanner/.m2/repository/org/springframework/spring-webflux/5.0.8.RELEASE/spring-webflux-5.0.8.RELEASE.jar</p>
<p>
Dependency Hierarchy:
- spring-boot-starter-webflux-2.0.4.RELEASE.jar (Root Library)
- :x: **spring-webflux-5.0.8.RELEASE.jar** (Vulnerable Library)
</details>
<details><summary><b>spring-web-5.0.8.RELEASE.jar</b></p></summary>
<p>Spring Web</p>
<p>Library home page: <a href="https://github.com/spring-projects/spring-framework">https://github.com/spring-projects/spring-framework</a></p>
<p>Path to dependency file: concourse-ci-cd/pom.xml</p>
<p>Path to vulnerable library: /home/wss-scanner/.m2/repository/org/springframework/spring-web/5.0.8.RELEASE/spring-web-5.0.8.RELEASE.jar</p>
<p>
Dependency Hierarchy:
- spring-boot-starter-webflux-2.0.4.RELEASE.jar (Root Library)
- :x: **spring-web-5.0.8.RELEASE.jar** (Vulnerable Library)
</details>
<p>Found in base branch: <b>master</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
In Spring Framework versions 5.3.0 - 5.3.10, 5.2.0 - 5.2.17, and older unsupported versions, it is possible for a user to provide malicious input to cause the insertion of additional log entries.
<p>Publish Date: 2021-10-28
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-22096>CVE-2021-22096</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>4.3</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: Low
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: Low
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://tanzu.vmware.com/security/cve-2021-22096">https://tanzu.vmware.com/security/cve-2021-22096</a></p>
<p>Release Date: 2021-10-28</p>
<p>Fix Resolution: org.springframework:spring:5.2.18.RELEASE,5.3.12</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github) | non_priority | cve medium detected in spring webflux release jar spring web release jar cve medium severity vulnerability vulnerable libraries spring webflux release jar spring web release jar spring webflux release jar spring webflux library home page a href path to dependency file concourse ci cd pom xml path to vulnerable library home wss scanner repository org springframework spring webflux release spring webflux release jar dependency hierarchy spring boot starter webflux release jar root library x spring webflux release jar vulnerable library spring web release jar spring web library home page a href path to dependency file concourse ci cd pom xml path to vulnerable library home wss scanner repository org springframework spring web release spring web release jar dependency hierarchy spring boot starter webflux release jar root library x spring web release jar vulnerable library found in base branch master vulnerability details in spring framework versions and older unsupported versions it is possible for a user to provide malicious input to cause the insertion of additional log entries publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required low user interaction none scope unchanged impact metrics confidentiality impact none integrity impact low availability impact none for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution org springframework spring release step up your open source security game with whitesource | 0 |
72,239 | 8,712,605,268 | IssuesEvent | 2018-12-06 22:48:46 | kowhai-2018/Final-Project | https://api.github.com/repos/kowhai-2018/Final-Project | closed | Design a prototype/mockup of app using design guides | client design documentation | Have a wiki page dedicated to the design specs including all the different types of design guides, colours etc from #2 .
Relates to #2 | 1.0 | Design a prototype/mockup of app using design guides - Have a wiki page dedicated to the design specs including all the different types of design guides, colours etc from #2 .
Relates to #2 | non_priority | design a prototype mockup of app using design guides have a wiki page dedicated to the design specs including all the different types of design guides colours etc from relates to | 0 |
120,595 | 10,129,799,477 | IssuesEvent | 2019-08-01 15:30:20 | w3c/webrtc-pc | https://api.github.com/repos/w3c/webrtc-pc | closed | callback-based https://w3c.github.io/webrtc-pc/#method-extensions are not covered by WPT tests | Needs Test question | Looking at WPT webrtc tests, I do not find any testing for callback-based createOffer and addIceCandidate.
Are there already tests covering these somewhere?
It would be nice to add such WPT tests if not already covered. | 1.0 | callback-based https://w3c.github.io/webrtc-pc/#method-extensions are not covered by WPT tests - Looking at WPT webrtc tests, I do not find any testing for callback-based createOffer and addIceCandidate.
Are there already tests covering these somewhere?
It would be nice to add such WPT tests if not already covered. | non_priority | callback based are not covered by wpt tests looking at wpt webrtc tests i do not find any testing for callback based createoffer and addicecandidate are there already tests covering these somewhere it would be nice to add such wpt tests if not already covered | 0 |
469,289 | 13,505,002,390 | IssuesEvent | 2020-09-13 20:40:00 | apexcharts/apexcharts.js | https://api.github.com/repos/apexcharts/apexcharts.js | reopened | Chart height on mobile chrome/brave not correct (depending on number of lines in legend) | bug high-priority legend mobile | ## Codepen
https://codepen.io/guybrush-the-bold/full/LoXdgN
## Explanation
- What is the behavior you expect?
The chart should look the same on all devices
- What is happening instead?
On Chrome (and Brave) the height of the chart plot is much less than on other devices, depending on the number of lines in the legend.
- What error message are you getting?
None
Here is a screenshot: (on the left its chrome on desktop, on the right its chrome on android)

| 1.0 | Chart height on mobile chrome/brave not correct (depending on number of lines in legend) - ## Codepen
https://codepen.io/guybrush-the-bold/full/LoXdgN
## Explanation
- What is the behavior you expect?
The chart should look the same on all devices
- What is happening instead?
On Chrome (and Brave) the height of the chart plot is much less than on other devices, depending on the number of lines in the legend.
- What error message are you getting?
None
Here is a screenshot: (on the left its chrome on desktop, on the right its chrome on android)

| priority | chart height on mobile chrome brave not correct depending on number of lines in legend codepen explanation what is the behavior you expect the chart should look the same on all devices what is happening instead on chrome and brave the height of the chart plot is much less than on other devices depending on the amount of lines in the legend what error message are you getting none here is a screenshot on the left its chrome on desktop on the right its chrome on android | 1 |
633,803 | 20,266,206,910 | IssuesEvent | 2022-02-15 12:19:20 | tendermint/starport | https://api.github.com/repos/tendermint/starport | opened | network: stabilize HTTP tunneling | priority/high network | The feature first implemented by https://github.com/tendermint/starport/pull/2055.
There is a connectivity problem with more than two nodes when Gitpod and VM nodes are used together.
@ivanovpetr also reported that:
> And I also found out that gitpod node id changes every time when it stopped
Let's ensure that
- [ ] We can run a network with 2 Gitpod and 2 VM instances.
- [ ] We can run a network with 4 Gitpod instances.
- [ ] We can run a network with 4 VM instances.
| 1.0 | network: stabilize HTTP tunneling - The feature first implemented by https://github.com/tendermint/starport/pull/2055.
There is a connectivity problem with more than two nodes when Gitpod and VM nodes are used together.
@ivanovpetr also reported that:
> And I also found out that gitpod node id changes every time when it stopped
Let's ensure that
- [ ] We can run a network with 2 Gitpod and 2 VM instances.
- [ ] We can run a network with 4 Gitpod instances.
- [ ] We can run a network with 4 VM instances.
| priority | network stabilize http tunneling the feature first implemented by there is a connectivity problem with more than two nodes when gitpod and vm nodes used together ivanovpetr also reported that and i also found out that gitpod node id changes every time when it stopped let s ensure that we can run a network with gitpod and vm instances we can run a network with gitpod instances we can run a network with vm instances | 1 |
177,781 | 21,509,183,168 | IssuesEvent | 2022-04-28 01:13:31 | rgordon95/advanced-react-demo | https://api.github.com/repos/rgordon95/advanced-react-demo | opened | WS-2021-0153 (High) detected in ejs-2.5.6.tgz | security vulnerability | ## WS-2021-0153 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>ejs-2.5.6.tgz</b></p></summary>
<p>Embedded JavaScript templates</p>
<p>Library home page: <a href="https://registry.npmjs.org/ejs/-/ejs-2.5.6.tgz">https://registry.npmjs.org/ejs/-/ejs-2.5.6.tgz</a></p>
<p>Path to dependency file: /advanced-react-demo/package.json</p>
<p>Path to vulnerable library: /node_modules/ejs/package.json</p>
<p>
Dependency Hierarchy:
- :x: **ejs-2.5.6.tgz** (Vulnerable Library)
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
Arbitrary Code Injection vulnerability was found in ejs before 3.1.6. Caused by filename which isn't sanitized for display.
<p>Publish Date: 2021-01-22
<p>URL: <a href=https://github.com/mde/ejs/commit/abaee2be937236b1b8da9a1f55096c17dda905fd>WS-2021-0153</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>9.8</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: High
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://github.com/mde/ejs/issues/571">https://github.com/mde/ejs/issues/571</a></p>
<p>Release Date: 2021-01-22</p>
<p>Fix Resolution: 3.1.6</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github) | True | WS-2021-0153 (High) detected in ejs-2.5.6.tgz - ## WS-2021-0153 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>ejs-2.5.6.tgz</b></p></summary>
<p>Embedded JavaScript templates</p>
<p>Library home page: <a href="https://registry.npmjs.org/ejs/-/ejs-2.5.6.tgz">https://registry.npmjs.org/ejs/-/ejs-2.5.6.tgz</a></p>
<p>Path to dependency file: /advanced-react-demo/package.json</p>
<p>Path to vulnerable library: /node_modules/ejs/package.json</p>
<p>
Dependency Hierarchy:
- :x: **ejs-2.5.6.tgz** (Vulnerable Library)
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
Arbitrary Code Injection vulnerability was found in ejs before 3.1.6. Caused by filename which isn't sanitized for display.
<p>Publish Date: 2021-01-22
<p>URL: <a href=https://github.com/mde/ejs/commit/abaee2be937236b1b8da9a1f55096c17dda905fd>WS-2021-0153</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>9.8</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: High
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://github.com/mde/ejs/issues/571">https://github.com/mde/ejs/issues/571</a></p>
<p>Release Date: 2021-01-22</p>
<p>Fix Resolution: 3.1.6</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github) | non_priority | ws high detected in ejs tgz ws high severity vulnerability vulnerable library ejs tgz embedded javascript templates library home page a href path to dependency file advanced react demo package json path to vulnerable library node modules ejs package json dependency hierarchy x ejs tgz vulnerable library vulnerability details arbitrary code injection vulnerability was found in ejs before caused by filename which isn t sanitized for display publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction none scope unchanged impact metrics confidentiality impact high integrity impact high availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution step up your open source security game with whitesource | 0 |
37,765 | 18,764,850,412 | IssuesEvent | 2021-11-05 21:40:21 | Azure/azure-functions-host | https://api.github.com/repos/Azure/azure-functions-host | closed | Dynamic Concurrency Support | performance | ## Overview
This epic item tracks the work across the various repos for enabling **Dynamic Concurrency** (DC). The idea for DC is to define a collaborative model between the host and extensions to allow extensions to support a dynamic concurrency mode. In this mode, rather than the user manually configuring host level concurrency knobs for various extensions (e.g. BatchSize/NewBatchThreshold for Queues), the extension would dynamically adjust concurrency at runtime to find optimal values based on host health metrics. DC is a core WebJobs SDK feature that will be leveraged by Azure Functions.
## Goals
- **Simplify configuration**: don't require the user to learn and set a bunch of config knobs to throttle a function down to protect machine resources. Similarly, in the other direction: to increase single-instance concurrency, knobs often have to be adjusted up (e.g. SB PrefetchCount/MaxConcurrentCalls). Right now, a customer must use trial and error to arrive at the right numbers. For a particular workload, it's very hard to arrive at a configuration that maximizes throughput.
- **Protect the host from overload**: A single function might only be able to process 2 invocations concurrently, while defaults allow for more. This means that the host will become overloaded unless the user goes in and reduces concurrency via config.
- **Improve throughput**: single-instance throttling can help increase overall throughput by ensuring individual instances don't eagerly pull more work than they can quickly process. That is, we want to avoid prefetching work onto a single instance that can't process it soon, so that work can instead be load-balanced out to other workers.
## Planning
Target for initial release is to enable DC for the Queue and ServiceBus extensions. We'll onboard other extensions as we can. Following is a list of the main work items to complete before the first version can be released:
- **WebJobs SDK**
- [x] ConcurrencyManager
- [x] ConcurrencyOptions configuration - define configurable aspects
- [x] IHostHealthMonitor and default impl. for monitoring host/child process stats (used by throttles)
- [x] ConcurrencyManagerService (IHostedService) responsible for startup init of ConcurrencyManager, snapshot init, periodic snapshot storage on primary instance, etc.
- [x] CPU Throttle
- [x] Memory Throttle
- [x] ThreadPool Starvation Throttle
- [x] Snapshot Status Persistence (IConcurrencyStatusRepository) and blob storage implementation
- [x] Move Functions ProcessMonitor into core SDK
- [x] Move Functions IPrimaryHostStateProvider and related types (e.g. PrimaryHostCoordinator) into core SDK (needed for DC status snapshot persistence)
- **Azure Queue extension (Track2)**
- [x] Update QueueListener to use ConcurrencyManager based on ConcurrencyOptions
- **Service Bus extension (Track2)**
- [x] Update ServiceBusListener to use ConcurrencyManager based on ConcurrencyOptions
- **Functions Host**
- [x] Move required types to WebJobs SDK as noted above (ProcessMonitor, IPrimaryHostStateProvider, etc.)
- [x] Implement/register OOP worker channel throttle provider
- [x] Integrate OOP worker management with IHostHealthMonitor (register/unregister child processes for monitoring)
- [x] Integrate throttle results with `admin/host/ping` endpoint for scale out
- [x] Implement OOP worker process management improvements such that the existing worker configuration isn't required (e.g. process count/thread count). Instead, the host should just optimize things dynamically. | True | Dynamic Concurrency Support - ## Overview
This epic item tracks the work across the various repos for enabling **Dynamic Concurrency** (DC). The idea for DC is to define a collaborative model between the host and extensions to allow extensions to support a dynamic concurrency mode. In this mode, rather than the user manually configuring host level concurrency knobs for various extensions (e.g. BatchSize/NewBatchThreshold for Queues), the extension would dynamically adjust concurrency at runtime to find optimal values based on host health metrics. DC is a core WebJobs SDK feature that will be leveraged by Azure Functions.
## Goals
- **Simplify configuration**: don't require the user to learn and set a bunch of config knobs to throttle a function down to protect machine resources. Similarly, in the other direction: to increase single-instance concurrency, knobs often have to be adjusted up (e.g. SB PrefetchCount/MaxConcurrentCalls). Right now, a customer must use trial and error to arrive at the right numbers. For a particular workload, it's very hard to arrive at a configuration that maximizes throughput.
- **Protect the host from overload**: A single function might only be able to process 2 invocations concurrently, while defaults allow for more. This means that the host will become overloaded unless the user goes in and reduces concurrency via config.
- **Improve throughput**: single-instance throttling can help increase overall throughput by ensuring individual instances don't eagerly pull more work than they can quickly process. That is, we want to avoid prefetching work onto a single instance that can't process it soon, so that work can instead be load-balanced out to other workers.
## Planning
Target for initial release is to enable DC for the Queue and ServiceBus extensions. We'll onboard other extensions as we can. Following is a list of the main work items to complete before the first version can be released:
- **WebJobs SDK**
- [x] ConcurrencyManager
- [x] ConcurrencyOptions configuration - define configurable aspects
- [x] IHostHealthMonitor and default impl. for monitoring host/child process stats (used by throttles)
- [x] ConcurrencyManagerService (IHostedService) responsible for startup init of ConcurrencyManager, snapshot init, periodic snapshot storage on primary instance, etc.
- [x] CPU Throttle
- [x] Memory Throttle
- [x] ThreadPool Starvation Throttle
- [x] Snapshot Status Persistence (IConcurrencyStatusRepository) and blob storage implementation
- [x] Move Functions ProcessMonitor into core SDK
- [x] Move Functions IPrimaryHostStateProvider and related types (e.g. PrimaryHostCoordinator) into core SDK (needed for DC status snapshot persistence)
- **Azure Queue extension (Track2)**
- [x] Update QueueListener to use ConcurrencyManager based on ConcurrencyOptions
- **Service Bus extension (Track2)**
- [x] Update ServiceBusListener to use ConcurrencyManager based on ConcurrencyOptions
- **Functions Host**
- [x] Move required types to WebJobs SDK as noted above (ProcessMonitor, IPrimaryHostStateProvider, etc.)
- [x] Implement/register OOP worker channel throttle provider
- [x] Integrate OOP worker management with IHostHealthMonitor (register/unregister child processes for monitoring)
- [x] Integrate throttle results with `admin/host/ping` endpoint for scale out
- [x] Implement OOP worker process management improvements such that the existing worker configuration isn't required (e.g. process count/thread count). Instead, the host should just optimize things dynamically. | non_priority | dynamic concurrency support overview this epic item tracks the work across the various repos for enabling dynamic concurrency dc the idea for dc is to define a collaborative model between the host and extensions to allow extensions to support a dynamic concurrency mode in this mode rather than the user manually configuring host level concurrency knobs for various extensions e g batchsize newbatchthreshold for queues the extension would dynamically adjust concurrency at runtime to find optimal values based on host health metrics dc is a core webjobs sdk feature that will be leveraged by azure functions goals simplify configuration don t require user to learn set a bunch of config knobs to throttle a function down to protect machine resources similarly the other way to increase single instance concurrency often knobs have to be adjusted up e g sb prefetchcount maxconcurrentcalls right now a customer must use trial and error to arrive at the right numbers here for a particular workload it’s very hard to arrive at a configuration that maximizes throughput protect the host from overload a single function might only be able to process invocations concurrently while defaults allow for more this means that the host will become overloaded unless the user goes in and reduces concurrency via config improve throughput single instance throttling can help increase overall throughput by ensuring individual instances don t eagerly pull more work than they can quickly process i e we want to avoid prefetching work on a single instance that can t process soon so it can be load balanced out to other workers planning target for initial release is to enable dc for the queue and servicebus extensions we ll onboard other extensions as we can following is a list of 
the main work items to complete before the first version can be released webjobs sdk concurrencymanager concurrencyoptions configuration define configurable aspects ihosthealthmonitor and default impl for monitoring host child process stats used by throttles concurrencymanagerservice ihostedservice responsible for startup init of concurrencymanager snapshot init periodic snapshot storage on primary instance etc cpu throttle memory throttle threadpool starvation throttle snapshot status persistence iconcurrencystatusrepository and blob storage implementation move functions processmonitor into core sdk move functions iprimaryhoststateprovider and related types e g primaryhostcoordinator into core sdk needed for dc status snapshot persistence azure queue extension update queuelistener to use concurrencymanager based on concurrencyoptions service bus extension update servicebuslistener to use concurrencymanager based on concurrencyoptions functions host move required types to webjobs sdk as noted above processmonitor iprimaryhoststateprovider etc implement register oop worker channel throttle provider integrate oop worker management with ihosthealthmonitor register unregister child processes for monitoring integrate throttle results with admin host ping endpoint for scale out implement oop worker process management improvements such that the existing worker configuration isn t required e g process count thread count instead the host should just optimize things dynamically | 0 |
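The Dynamic Concurrency issue above describes extensions that adjust concurrency at runtime from host health metrics instead of fixed config knobs. As a rough sketch of the basic control loop only (the class and method names below are invented for illustration and are not the WebJobs SDK API), an additive-increase/multiplicative-decrease controller captures the idea:

```python
class ConcurrencyThrottle:
    """Toy additive-increase/multiplicative-decrease concurrency controller.

    Illustrative only: the real ConcurrencyManager combines several host
    health throttles (CPU, memory, thread-pool starvation); this sketch
    reduces them to a single boolean health signal.
    """

    def __init__(self, initial: int = 1, maximum: int = 100):
        self.limit = initial
        self.maximum = maximum

    def adjust(self, host_healthy: bool) -> int:
        if host_healthy:
            # Probe for more throughput one slot at a time.
            self.limit = min(self.limit + 1, self.maximum)
        else:
            # Back off quickly to protect the host, never below 1.
            self.limit = max(self.limit // 2, 1)
        return self.limit

throttle = ConcurrencyThrottle()
for _ in range(10):
    throttle.adjust(host_healthy=True)   # ramps up from 1 to 11
throttle.adjust(host_healthy=False)      # halves to 5 on an unhealthy signal
print(throttle.limit)
```

A listener (e.g. for Queues or Service Bus) would consult `limit` before fetching more messages, which is what replaces the manually configured BatchSize/NewBatchThreshold-style knobs.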
39,833 | 9,670,953,544 | IssuesEvent | 2019-05-21 21:14:51 | bridgedotnet/Bridge | https://api.github.com/repos/bridgedotnet/Bridge | closed | await on an already completed Task can abort async method | area-task defect | When using await on a Task object that is already completed, the javascript code nevertheless calls the function "continueWith" on the Task. This function then in turn immediately calls back to execute the continuation. This causes an ever-deeper nesting of function calls until some overflow happens and the execution terminates.
The correct behavior in this case would be to directly test the Task object for completion and directly continue execution in the original state machine function. This is also the way it is implemented by C#.
### Steps To Reproduce
https://deck.net/2a3086e6e46a00529d010cf7b441ddb4
```csharp
public class Program
{
public static void Main()
{
int[] count = new int[1];
Task t = TightLoop(count);
Console.WriteLine("Iterations: "+count[0]);
}
public static async Task TightLoop(int[] count)
{
Task finished = Task.FromResult<String>(null);
for (int i=0; i<10000; i++)
{ await finished;
count[0]++;
}
}
}
```
### Expected Result
```js
Iterations: 10000
```
### Actual Result
```js
Iterations: 2224
```
await on an already completed Task can abort async method - When using await on a Task object that is already completed, the JavaScript code nevertheless calls the "continueWith" function on the Task, which in turn immediately calls back to execute the continuation. This causes an ever-deeper nesting of function calls until the stack overflows and execution terminates.
The correct behavior in this case would be to test the Task object for completion and, if it is already complete, continue execution directly in the original state-machine function. This is also how C# implements it.
### Steps To Reproduce
https://deck.net/2a3086e6e46a00529d010cf7b441ddb4
```csharp
public class Program
{
public static void Main()
{
int[] count = new int[1];
Task t = TightLoop(count);
Console.WriteLine("Iterations: "+count[0]);
}
public static async Task TightLoop(int[] count)
{
Task finished = Task.FromResult<String>(null);
for (int i=0; i<10000; i++)
{ await finished;
count[0]++;
}
}
}
```
### Expected Result
```js
Iterations: 10000
```
### Actual Result
```js
Iterations: 2224
```
| non_priority | await on an already completed task can abort async method when using await on a task object that is already completed the javascript code nevertheless calls the function continuewith on the task this function then in turn immediately calls back to execute the continuation this causes an ever deeper nesting of function calls until some overflow happens and the execution terminates the correct behavior in this case would be to directly test the task object for completion and directly continue execution in the original state machine function this is also the way it is implemented by c steps to reproduce csharp public class program public static void main int count new int task t tightloop count console writeline iterations count public static async task tightloop int count task finished task fromresult null for int i i i await finished count expected result js iterations actual result js iterations | 0 |
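The Bridge defect above hinges on a contract that awaiting an already-completed task should resume the state machine inline rather than recurse through continuations. For comparison (a Python sketch, not Bridge's JavaScript output), asyncio honors that contract: awaiting a future that is already done returns its result directly, so a tight loop like the C# repro completes all iterations:

```python
import asyncio

async def tight_loop(n: int) -> int:
    # Awaiting an already-completed future must not nest callbacks,
    # so all n iterations finish without overflowing the stack.
    count = 0
    loop = asyncio.get_running_loop()
    for _ in range(n):
        fut = loop.create_future()
        fut.set_result(None)  # future is complete before the await
        await fut
        count += 1
    return count

print(asyncio.run(tight_loop(10_000)))  # 10000, matching the expected C# result
```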
470,656 | 13,542,353,983 | IssuesEvent | 2020-09-16 17:12:35 | googleapis/python-storage | https://api.github.com/repos/googleapis/python-storage | closed | Hashes are no longer returned for partial downloads | api: storage needs more info priority: p2 | #204 Changed behavior for hashes for partial downloads.
When a partial request is made, we do not return the 'X-Goog-Hash' headers. As a result,
[blob.py:812](https://github.com/googleapis/python-storage/pull/204/files#diff-5d3277a5f0f072a447a1eb89f9fa1ae0R812) overwrites the `blob.crc32c` value with its default, `None`.
[blob.py:813](https://github.com/googleapis/python-storage/pull/204/files#diff-5d3277a5f0f072a447a1eb89f9fa1ae0R813) overwrites the `blob.md5_hash` value with its default, `None`.
#### Environment details
- OS type and version: Linux 4.19.0-10-cloud-amd64 #1 SMP Debian 4.19.132-1 (2020-07-24) x86_64
- Python version: `python --version`: 3.7.3, 3.8.2
- pip version: `pip --version` 18.1
- `google-cloud-storage` version: `pip show google-cloud-storage` 1.30+
#### Steps to reproduce
1. Get blob
2. issue a partial download (i.e. `blob.download_to_filename(start=0)`)
3. Notice `blob.md5_hash` and `blob.crc32c` are `None`.
#### Code example
```python
client = storage.Client()
bucket = client.get_bucket(BUCKET_ID)
blob = bucket.get_blob(FILE_NAME)
print('blob.md5_hash: {:}'.format(blob.md5_hash)) # blob.md5_hash: 3qgj3i0eVt8mnNjtSErqZQ==
print('blob.crc32c: {:}'.format(blob.crc32c)) # blob.crc32c: AeFI7Q==
# MD5
blob.download_to_filename(FILE_NAME)
print('blob.md5_hash: {:}'.format(blob.md5_hash)) # blob.md5_hash: 3qgj3i0eVt8mnNjtSErqZQ==
blob.download_to_filename(FILE_NAME, start=0)
print('blob.md5_hash: {:}'.format(blob.md5_hash)) # blob.md5_hash: None
# CRC32C
blob.download_to_filename(FILE_NAME)
print('blob.crc32c: {:}'.format(blob.crc32c)) # blob.crc32c: AeFI7Q==
blob.download_to_filename(FILE_NAME, start=0)
print('blob.crc32c: {:}'.format(blob.crc32c)) # blob.crc32c: None
```
Hashes are no longer returned for partial downloads - PR #204 changed the behavior of hash properties for partial downloads.
When a partial request is made, we do not return the 'X-Goog-Hash' headers. As a result,
[blob.py:812](https://github.com/googleapis/python-storage/pull/204/files#diff-5d3277a5f0f072a447a1eb89f9fa1ae0R812) overwrites the `blob.crc32c` value with its default, `None`.
[blob.py:813](https://github.com/googleapis/python-storage/pull/204/files#diff-5d3277a5f0f072a447a1eb89f9fa1ae0R813) overwrites the `blob.md5_hash` value with its default, `None`.
#### Environment details
- OS type and version: Linux 4.19.0-10-cloud-amd64 #1 SMP Debian 4.19.132-1 (2020-07-24) x86_64
- Python version: `python --version`: 3.7.3, 3.8.2
- pip version: `pip --version` 18.1
- `google-cloud-storage` version: `pip show google-cloud-storage` 1.30+
#### Steps to reproduce
1. Get blob
2. issue a partial download (i.e. `blob.download_to_filename(start=0)`)
3. Notice `blob.md5_hash` and `blob.crc32c` are `None`.
#### Code example
```python
client = storage.Client()
bucket = client.get_bucket(BUCKET_ID)
blob = bucket.get_blob(FILE_NAME)
print('blob.md5_hash: {:}'.format(blob.md5_hash)) # blob.md5_hash: 3qgj3i0eVt8mnNjtSErqZQ==
print('blob.crc32c: {:}'.format(blob.crc32c)) # blob.crc32c: AeFI7Q==
# MD5
blob.download_to_filename(FILE_NAME)
print('blob.md5_hash: {:}'.format(blob.md5_hash)) # blob.md5_hash: 3qgj3i0eVt8mnNjtSErqZQ==
blob.download_to_filename(FILE_NAME, start=0)
print('blob.md5_hash: {:}'.format(blob.md5_hash)) # blob.md5_hash: None
# CRC32C
blob.download_to_filename(FILE_NAME)
print('blob.crc32c: {:}'.format(blob.crc32c)) # blob.crc32c: AeFI7Q==
blob.download_to_filename(FILE_NAME, start=0)
print('blob.crc32c: {:}'.format(blob.crc32c)) # blob.crc32c: None
```
| priority | hashes are no longer returned for partial downloads changed behavior for hashes for partial downloads when a partial request is made we do not return the x goog hash headers as a result overwrites the blob value with its default none overwrites the blob hash value with its default none environment details os type and version linux cloud smp debian python version python version pip version pip version google cloud storage version pip show google cloud storage steps to reproduce get blob issue a partial download ie blob download to filename start notice blob hash and blob are none code example python client storage client bucket client get bucket bucket id blob bucket get blob file name print blob hash format blob hash blob hash print blob format blob blob blob download to filename file name print blob hash format blob hash blob hash blob download to filename file name start print blob hash format blob hash blob hash none blob download to filename file name print blob format blob blob blob download to filename file name start print blob format blob blob none | 1 |
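Given the bug above, one workaround until the hash properties are restored is to recompute the hash client-side over the downloaded bytes. GCS reports `md5Hash` as the base64-encoded raw MD5 digest, so a locally computed value can be compared directly against the blob metadata (a sketch; it covers MD5 only, since CRC32C needs a third-party package such as `google-crc32c`):

```python
import base64
import hashlib

def gcs_style_md5(data: bytes) -> str:
    """Return the MD5 digest in GCS's md5Hash format (base64 of the raw digest)."""
    return base64.b64encode(hashlib.md5(data).digest()).decode("ascii")

# Compare against blob.md5_hash from bucket.get_blob(...) metadata,
# which is unaffected by the partial-download overwrite.
print(gcs_style_md5(b"hello world"))  # XrY7u+Ae7tCTyyK7j1rNww==
```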
93,303 | 3,898,884,202 | IssuesEvent | 2016-04-17 11:39:25 | CoderDojo/community-platform | https://api.github.com/repos/CoderDojo/community-platform | closed | Links to Charter & privacy statement broken when logged in | bug high priority | The Charter link in the nav and the privacy statement link in the footer seem to break when a user is logged in but work fine when no one is logged in.
1. Log in your account in zen
2. Click privacy statement - will give 404
3. Click charter - will go to https://zen.coderdojo.com/undefined
if you skip step 2, charter works fine even when logged in. | 1.0 | Links to Charter & privacy statement broken when logged in - Charter link in nav and privacy statement link in footer seems to break when a user is logged in but works fine when no one is logged in.
1. Log in your account in zen
2. Click privacy statement - will give 404
3. Click charter - will go to https://zen.coderdojo.com/undefined
if you skip step 2, charter works fine even when logged in. | priority | links to charter privacy statement broken when logged in charter link in nav and privacy statement link in footer seems to break when a user is logged in but works fine when no one is logged in log in your account in zen click privacy statement will give click charter will go to if you skip step charter works fine even when logged in | 1 |
297,833 | 25,766,690,557 | IssuesEvent | 2022-12-09 02:49:57 | PalisadoesFoundation/talawa-api | https://api.github.com/repos/PalisadoesFoundation/talawa-api | closed | Resolvers: Create Tests for getDonationByOrgId | good first issue unapproved parent points 02 test | The Talawa-API code base needs to be 100% reliable. This means we need to have 100% test code coverage.
Tests need to be written for file `lib/resolvers/donation_query/getDonationByOrgId.js`
- We will need the API to be refactored for all methods, classes and/or functions found in this file for testing to be correctly executed.
- When complete, all methods, classes and/or functions in the refactored file will need to be tested. These tests must be placed in a
single file with the name `tests/resolvers/donation_query/getDonationByOrgId.sepc.js`. You may need to create the appropriate directory structure to do this.
### IMPORTANT:
Please refer to the parent issue on how to implement these tests correctly:
- https://github.com/PalisadoesFoundation/talawa-api/issues/490
### PR Acceptance Criteria
- When complete this file must show **100%** coverage when merged into the code base. This will be clearly visible when you submit your PR.
- [The current code coverage for the file can be found here](https://codecov.io/gh/PalisadoesFoundation/talawa-api/tree/develop/lib/resolvers/organization_query/). If the file isn't found in this directory, or there is a 404 error, then tests have not been created.
- The PR will show a report for the code coverage for the file you have added. You can use that as a guide. | 1.0 | Resolvers: Create Tests for getDonationByOrgId - The Talawa-API code base needs to be 100% reliable. This means we need to have 100% test code coverage.
Tests need to be written for file `lib/resolvers/donation_query/getDonationByOrgId.js`
- We will need the API to be refactored for all methods, classes and/or functions found in this file for testing to be correctly executed.
- When complete, all methods, classes and/or functions in the refactored file will need to be tested. These tests must be placed in a
single file with the name `tests/resolvers/donation_query/getDonationByOrgId.spec.js`. You may need to create the appropriate directory structure to do this.
### IMPORTANT:
Please refer to the parent issue on how to implement these tests correctly:
- https://github.com/PalisadoesFoundation/talawa-api/issues/490
### PR Acceptance Criteria
- When complete this file must show **100%** coverage when merged into the code base. This will be clearly visible when you submit your PR.
- [The current code coverage for the file can be found here](https://codecov.io/gh/PalisadoesFoundation/talawa-api/tree/develop/lib/resolvers/organization_query/). If the file isn't found in this directory, or there is a 404 error, then tests have not been created.
- The PR will show a report for the code coverage for the file you have added. You can use that as a guide. | non_priority | resolvers create tests for getdonationbyorgid the talawa api code base needs to be reliable this means we need to have test code coverage tests need to be written for file lib resolvers donation query getdonationbyorgid js we will need the api to be refactored for all methods classes and or functions found in this file for testing to be correctly executed when complete all methods classes and or functions in the refactored file will need to be tested these tests must be placed in a single file with the name tests resolvers donation query getdonationbyorgid sepc js you may need to create the appropriate directory structure to do this important please refer to the parent issue on how to implement these tests correctly pr acceptance criteria when complete this file must show coverage when merged into the code base this will be clearly visible when you submit your pr if the file isn t found in this directory or there is a error then tests have not been created the pr will show a report for the code coverage for the file you have added you can use that as a guide | 0 |
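As a rough illustration of what the 100%-coverage acceptance criterion in the record above measures: line coverage is simply the share of executable lines exercised by the test suite. A tiny, illustrative calculation (the function name and numbers are made up for this sketch, not taken from the Talawa-API tooling):

```python
# Illustrative only: line coverage expressed as covered lines / total lines.

def coverage_pct(covered_lines: int, total_lines: int) -> float:
    """Percentage of executable lines hit by the test suite."""
    if total_lines == 0:
        return 100.0  # nothing to cover counts as fully covered
    return 100.0 * covered_lines / total_lines

if __name__ == "__main__":
    print(coverage_pct(42, 42))  # 100.0 -> meets the acceptance criterion
    print(coverage_pct(21, 42))  # 50.0  -> PR would fail the coverage gate
```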
47,207 | 19,573,728,087 | IssuesEvent | 2022-01-04 13:09:51 | hashicorp/terraform-provider-aws | https://api.github.com/repos/hashicorp/terraform-provider-aws | closed | resource/aws_vpc_ipam_organization_admin_account | enhancement new-resource service/ec2 | <!--- Please keep this note for the community --->
### Community Note
* Please vote on this issue by adding a 👍 [reaction](https://blog.github.com/2016-03-10-add-reactions-to-pull-requests-issues-and-comments/) to the original issue to help the community and maintainers prioritize this request
* Please do not leave "+1" or other comments that do not add relevant new information or questions, they generate extra noise for issue followers and do not help prioritize the request
* If you are interested in working on this issue or have submitted a pull request, please leave a comment
<!--- Thank you for keeping this note for the community --->
### Description
Terraform resource to manage enabling aws orgs for ipam
- [create](https://docs.aws.amazon.com/cli/latest/reference/ec2/enable-ipam-organization-admin-account.html)
- [read](https://docs.aws.amazon.com/cli/latest/reference/organizations/list-delegated-administrators.html) `aws organizations list-delegated-administrators --service-principal ipam.amazonaws.com`
- [delete](https://docs.aws.amazon.com/cli/latest/reference/ec2/disable-ipam-organization-admin-account.html)
<!--- Please leave a helpful description of the feature request here. --->
### New or Affected Resource(s)
<!--- Please list the new or affected resources and data sources. --->
* aws_vpc_ipam_organization_admin_account
### Potential Terraform Configuration
<!--- Information about code formatting: https://help.github.com/articles/basic-writing-and-formatting-syntax/#quoting-code --->
```hcl
resource "aws_vpc_ipam_organization_admin_account" "example" {
delegated_account_admin_id = "123456789012"
}
```
### References
<!---
Information about referencing Github Issues: https://help.github.com/articles/basic-writing-and-formatting-syntax/#referencing-issues-and-pull-requests
Are there any other GitHub issues (open or closed) or pull requests that should be linked here? Vendor blog posts or documentation? For example:
* https://aws.amazon.com/about-aws/whats-new/2018/04/introducing-amazon-ec2-fleet/
--->
* #0000
| 1.0 | resource/aws_vpc_ipam_organization_admin_account - <!--- Please keep this note for the community --->
### Community Note
* Please vote on this issue by adding a 👍 [reaction](https://blog.github.com/2016-03-10-add-reactions-to-pull-requests-issues-and-comments/) to the original issue to help the community and maintainers prioritize this request
* Please do not leave "+1" or other comments that do not add relevant new information or questions, they generate extra noise for issue followers and do not help prioritize the request
* If you are interested in working on this issue or have submitted a pull request, please leave a comment
<!--- Thank you for keeping this note for the community --->
### Description
Terraform resource to manage enabling aws orgs for ipam
- [create](https://docs.aws.amazon.com/cli/latest/reference/ec2/enable-ipam-organization-admin-account.html)
- [read](https://docs.aws.amazon.com/cli/latest/reference/organizations/list-delegated-administrators.html) `aws organizations list-delegated-administrators --service-principal ipam.amazonaws.com`
- [delete](https://docs.aws.amazon.com/cli/latest/reference/ec2/disable-ipam-organization-admin-account.html)
<!--- Please leave a helpful description of the feature request here. --->
### New or Affected Resource(s)
<!--- Please list the new or affected resources and data sources. --->
* aws_vpc_ipam_organization_admin_account
### Potential Terraform Configuration
<!--- Information about code formatting: https://help.github.com/articles/basic-writing-and-formatting-syntax/#quoting-code --->
```hcl
resource "aws_vpc_ipam_organization_admin_account" "example" {
delegated_account_admin_id = "123456789012"
}
```
### References
<!---
Information about referencing Github Issues: https://help.github.com/articles/basic-writing-and-formatting-syntax/#referencing-issues-and-pull-requests
Are there any other GitHub issues (open or closed) or pull requests that should be linked here? Vendor blog posts or documentation? For example:
* https://aws.amazon.com/about-aws/whats-new/2018/04/introducing-amazon-ec2-fleet/
--->
* #0000
| non_priority | resource aws vpc ipam organization admin account community note please vote on this issue by adding a 👍 to the original issue to help the community and maintainers prioritize this request please do not leave or other comments that do not add relevant new information or questions they generate extra noise for issue followers and do not help prioritize the request if you are interested in working on this issue or have submitted a pull request please leave a comment description terraform resource to manage enabling aws orgs for ipam aws organizations list delegated administrators service principal ipam amazonaws com new or affected resource s aws vpc ipam organization admin account potential terraform configuration hcl resource aws vpc ipam organization admin account example delegated account admin id references information about referencing github issues are there any other github issues open or closed or pull requests that should be linked here vendor blog posts or documentation for example | 0 |
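For the read step proposed in the record above, the provider would need to pick the IPAM delegated administrator out of a `list-delegated-administrators` response. A minimal sketch of that filtering logic as a pure function — the helper name and sample payload are illustrative assumptions, not part of the actual provider code:

```python
# Sketch: extract the delegated admin account Id from a
# ListDelegatedAdministrators-style response dict. The helper name
# and the sample data below are illustrative, not from the provider.
from typing import Optional


def find_ipam_admin(response: dict) -> Optional[str]:
    """Return the Id of the first delegated administrator, if any."""
    admins = response.get("DelegatedAdministrators", [])
    return admins[0]["Id"] if admins else None


if __name__ == "__main__":
    sample = {
        "DelegatedAdministrators": [
            {"Id": "123456789012", "Status": "ACTIVE"}
        ]
    }
    print(find_ipam_admin(sample))  # 123456789012
    print(find_ipam_admin({}))      # None
```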
7,800 | 19,409,914,676 | IssuesEvent | 2021-12-20 08:23:21 | TerriaJS/terriajs | https://api.github.com/repos/TerriaJS/terriajs | closed | MobX - Inconsistent Feature Highlighting | T-Architecture/refactor P-DroughtMap Version 8 | Not yet ported to MobX (TBC):
In some of the features, when you select the layer the selected polygon is not being highlighted, e.g., if you select Murray Darling Region in the Australian Drainage Division Layer then you can see the feature information pop-ups but the boundary has not been highlighted; https://uat.drought.gov.au/#share=s-cwf9HLHblkoA7BQdQ4zupzaa70C
**Note this particular layer has been fixed**

| 1.0 | MobX - Inconsistent Feature Highlighting - Not yet ported to MobX (TBC):
In some of the features when you select the layer the selected polygon is not being highlighted e.g., if you select Murray Darling Region in the Australian Drainage Division Layer then you can see the feature information pop-ups but the boundary is not been selected; https://uat.drought.gov.au/#share=s-cwf9HLHblkoA7BQdQ4zupzaa70C
**Note this particular layer has been fixed**

| non_priority | mobx inconsistent feature highlighting not yet ported to mobx tbc in some of the features when you select the layer the selected polygon is not being highlighted e g if you select murray darling region in the australian drainage division layer then you can see the feature information pop ups but the boundary is not been selected note this particular layer has been fixed | 0 |
119,610 | 25,546,149,006 | IssuesEvent | 2022-11-29 18:59:07 | dotnet/interactive | https://api.github.com/repos/dotnet/interactive | closed | [EXTERNAL] Python kernel with C# cell doesn't execute with C# kernel | bug Area-VS Code Extension waiting-on-feedback External Area-VS Code Jupyter Extension Interop | ### Describe the bug
Can't execute C# code into ipynb file in vscode stable. But with jupyter-lab I can:

### Please complete the following:
**Which version of .NET Interactive are you using?** There are a few ways to find this out:
1.0.215204
* In a notebook, run the `#!about` magic command.
Nothing; it just says 0.6s


* At the command line, run `dotnet interactive --version`.
1.0.215204+974732778592b69c526648c54ef4c48219058e97
- OS
- [ ] Windows 10
- [ ] macOS
- [x] Linux KDE Neon
- [ ] iOS
- [ ] Android
- Browser
- [ ] Chrome
- [ ] Edge
- [ ] Firefox
- [ ] Safari
- Frontend
- [x] Jupyter Notebook
- [ ] Jupyter Lab
- [ ] nteract
- [ ] Visual Studio Code
- [ ] Other (please specify)
### Screenshots





## Linux Distribution

| 2.0 | [EXTERNAL] Python kernel with C# cell doesn't execute with C# kernel - ### Describe the bug
Can't execute C# code into ipynb file in vscode stable. But with jupyter-lab I can:

### Please complete the following:
**Which version of .NET Interactive are you using?** There are a few ways to find this out:
1.0.215204
* In a notebook, run the `#!about` magic command.
Nothing; it just says 0.6s


* At the command line, run `dotnet interactive --version`.
1.0.215204+974732778592b69c526648c54ef4c48219058e97
- OS
- [ ] Windows 10
- [ ] macOS
- [x] Linux KDE Neon
- [ ] iOS
- [ ] Android
- Browser
- [ ] Chrome
- [ ] Edge
- [ ] Firefox
- [ ] Safari
- Frontend
- [x] Jupyter Notebook
- [ ] Jupyter Lab
- [ ] nteract
- [ ] Visual Studio Code
- [ ] Other (please specify)
### Screenshots





## Linux Distribution

| non_priority | python kernel with c cell doesn t execute with c kernel describe the bug can t execute c code into ipynb file in vscode stable but with jupyter lab i can please complete the following which version of net interactive are you using there are a few ways to find this out in a notebook run the about magic command nothing just say at the command line run dotnet interactive version os windows macos linux kde neon ios android browser chrome edge firefox safari frontend jupyter notebook jupyter lab nteract visual studio code other please specify screenshots linux distribution | 0 |
18,793 | 6,643,132,545 | IssuesEvent | 2017-09-27 10:04:48 | magicDGS/ReadTools | https://api.github.com/repos/magicDGS/ReadTools | closed | Pre-Sorted parts handling with DistmapPartDownloader | build/developer enhancement error prone GATK | Because the current implementation of `ReadsDataSource` from GATK assumes that the merged header is always `coordinate` sorted, it does not allow to check for already sorted batches.
If they support to have the merged header with the real sort order from all the files included, it will be nice to speed-up things in the case they are already sorted. | 1.0 | Pre-Sorted parts handling with DistmapPartDownloader - Because the current implementation of `ReadsDataSource` from GATK assumes that the merged header is always `coordinate` sorted, it does not allow to check for already sorted batches.
If they support to have the merged header with the real sort order from all the files included, it will be nice to speed-up things in the case they are already sorted. | non_priority | pre sorted parts handling with distmappartdownloader because the current implementation of readsdatasource from gatk assumes that the merged header is always coordinate sorted it does not allow to check for already sorted batches if they support to have the merged header with the real sort order from all the files included it will be nice to speed up things in the case they are already sorted | 0 |
333,417 | 29,582,992,332 | IssuesEvent | 2023-06-07 07:26:13 | nodejs/jenkins-alerts | https://api.github.com/repos/nodejs/jenkins-alerts | closed | test-rackspace-win2012r2_vs2019-x64-2 has low disk space | potential-incident test-ci | :warning: The machine `test-rackspace-win2012r2_vs2019-x64-2` has low space in Disk (used 91%).
Please refer to the [Jenkins Dashboard](https://ci.nodejs.org/manage/computer/test-rackspace-win2012r2_vs2019-x64-2) to check its status.
_This issue has been auto-generated by [UlisesGascon/jenkins-status-alerts-and-reporting](https://github.com/UlisesGascon/jenkins-status-alerts-and-reporting)._ | 1.0 | test-rackspace-win2012r2_vs2019-x64-2 has low disk space - :warning: The machine `test-rackspace-win2012r2_vs2019-x64-2` has low space in Disk (used 91%).
Please refer to the [Jenkins Dashboard](https://ci.nodejs.org/manage/computer/test-rackspace-win2012r2_vs2019-x64-2) to check its status.
_This issue has been auto-generated by [UlisesGascon/jenkins-status-alerts-and-reporting](https://github.com/UlisesGascon/jenkins-status-alerts-and-reporting)._ | non_priority | test rackspace has low disk space warning the machine test rackspace has low space in disk used please refer to the to check its status this issue has been auto generated by | 0 |
130,266 | 5,113,277,670 | IssuesEvent | 2017-01-06 14:49:53 | duhow/ProfesorOak | https://api.github.com/repos/duhow/ProfesorOak | closed | [IDEA] Módulo Git | enhancement priority-normal requiere otra | Well, in the end I'll think it over so I don't have things moving around here...
Make a plugin that does a git pull on the server whenever changes are made; that way the code can be kept up to date on both sides. Sync the necessary changes. | 1.0 | [IDEA] Módulo Git - Well, in the end I'll think it over so I don't have things moving around here...
Make a plugin that does a git pull on the server whenever changes are made; that way the code can be kept up to date on both sides. Sync the necessary changes. | priority | módulo git well in the end i ll think it over so i don t have things moving around here make a plugin that does a git pull on the server whenever changes are made that way the code can be kept up to date on both sides sync the necessary changes | 1 |
50,773 | 10,554,639,350 | IssuesEvent | 2019-10-03 19:56:31 | MicrosoftDocs/visualstudio-docs | https://api.github.com/repos/MicrosoftDocs/visualstudio-docs | closed | Visual Studio 2017 v15.9 will generate C26455 instead of C26439 for default constructors | Pri2 area - C++ area - code analysis doc-bug visual-studio-dev15/prod vs-ide-code-analysis/tech | The behavior in Visual Studio 2017 v15.9 has been changed:
https://developercommunity.visualstudio.com/content/problem/236762/c-code-analysis-warning-c26439-this-kind-of-functi.html
C26455 is now generated for default constructors not marked noexcept, which may be suppressed by adding noexcept or an explicit noexcept(false), "... noexcept(false) = default" is now also supported for the constructor.
Note: The link from C26455 to docs.microsoft.com from the Visual Studio error window doesn't work at the moment, as there is no page for C26455.
---
#### Document Details
⚠ *Do not edit this section. It is required for docs.microsoft.com ➟ GitHub issue linking.*
* ID: 6aa6301e-fa20-242a-6710-fe4cbf2b51b3
* Version Independent ID: 75e708e3-3356-1c8b-741d-9d6a558e3307
* Content: [C26439 - Visual Studio](https://docs.microsoft.com/en-us/visualstudio/code-quality/c26439?view=vs-2017)
* Content Source: [docs/code-quality/C26439.md](https://github.com/MicrosoftDocs/visualstudio-docs/blob/master/docs/code-quality/C26439.md)
* Product: **visual-studio-dev15**
* GitHub Login: @mikeblome
* Microsoft Alias: **mblome** | 2.0 | Visual Studio 2017 v15.9 will generate C26455 instead of C26439 for default constructors - The behavior in Visual Studio 2017 v15.9 has been changed:
https://developercommunity.visualstudio.com/content/problem/236762/c-code-analysis-warning-c26439-this-kind-of-functi.html
C26455 is now generated for default constructors not marked noexcept, which may be suppressed by adding noexcept or an explicit noexcept(false), "... noexcept(false) = default" is now also supported for the constructor.
Note: The link from C26455 to docs.microsoft.com from the Visual Studio error window doesn't work at the moment, as there is no page for C26455.
---
#### Document Details
⚠ *Do not edit this section. It is required for docs.microsoft.com ➟ GitHub issue linking.*
* ID: 6aa6301e-fa20-242a-6710-fe4cbf2b51b3
* Version Independent ID: 75e708e3-3356-1c8b-741d-9d6a558e3307
* Content: [C26439 - Visual Studio](https://docs.microsoft.com/en-us/visualstudio/code-quality/c26439?view=vs-2017)
* Content Source: [docs/code-quality/C26439.md](https://github.com/MicrosoftDocs/visualstudio-docs/blob/master/docs/code-quality/C26439.md)
* Product: **visual-studio-dev15**
* GitHub Login: @mikeblome
* Microsoft Alias: **mblome** | non_priority | visual studio will generate instead of for default constructors the behavior in visual studio has been changed is now generated for default constructors not marked noexcept which may be suppressed by adding noexcept or an explicit noexcept false noexcept false default is now also supported for the constructor note the link from to docs microsoft com from the visual studio error window doesn t work at the moment as there is no page for document details ⚠ do not edit this section it is required for docs microsoft com ➟ github issue linking id version independent id content content source product visual studio github login mikeblome microsoft alias mblome | 0 |
83,707 | 24,125,725,894 | IssuesEvent | 2022-09-21 00:02:28 | xamarin/xamarin-android | https://api.github.com/repos/xamarin/xamarin-android | closed | The "GenerateResourceDesigner" task failed unexpectedly with NullReferenceException | Area: App+Library Build possibly-stale | ### Steps to Reproduce
1. Create a Xamarin.Forms project that references a custom .NETStandard library and an SDK extras project
2. Try to build the Xamarin.Forms Android platform project
3. Android build fails with the above error message.
<!--
If you have a repro project, you may drag & drop the .zip/etc. onto the issue editor to attach it.
-->
### Expected Behavior
Build XF Android project without errors
### Actual Behavior
XF Android fails to build with the above error
### Version Information
Reproduced on both latest VS for Windows and VS for Windows Preview (16.3.10 and 16.4.0 Preview 6 respectively).
<!--
1. On macOS and within Visual Studio, select Visual Studio > About Visual Studio, then click the Show Details button, then click the Copy Information button.
2. Paste below this comment block.
-->
Microsoft Visual Studio Enterprise 2019
Version 16.3.10
VisualStudio.16.Release/16.3.10+29519.87
Microsoft .NET Framework
Version 4.8.03752
Installed Version: Enterprise
Visual C++ 2019 00435-60000-00000-AA598
Microsoft Visual C++ 2019
ADL Tools Service Provider 1.0
This package contains services used by Data Lake tools
Application Insights Tools for Visual Studio Package 9.1.00913.1
Application Insights Tools for Visual Studio
ASP.NET and Web Tools 2019 16.3.286.43615
ASP.NET and Web Tools 2019
ASP.NET Web Frameworks and Tools 2019 16.3.286.43615
For additional information, visit https://www.asp.net/
Azure App Service Tools v3.0.0 16.3.286.43615
Azure App Service Tools v3.0.0
Azure Data Lake Node 1.0
This package contains the Data Lake integration nodes for Server Explorer.
Azure Data Lake Tools for Visual Studio 2.4.1000.0
Microsoft Azure Data Lake Tools for Visual Studio
Azure Functions and Web Jobs Tools 16.3.286.43615
Azure Functions and Web Jobs Tools
Azure Stream Analytics Tools for Visual Studio 2.4.1000.0
Microsoft Azure Stream Analytics Tools for Visual Studio
C# Tools 3.3.1-beta3-19461-02+2fd12c210e22f7d6245805c60340f6a34af6875b
C# components used in the IDE. Depending on your project type and settings, a different version of the compiler may be used.
Common Azure Tools 1.10
Provides common services for use by Azure Mobile Services and Microsoft Azure Tools.
Extensibility Message Bus 1.2.0 (d16-2@8b56e20)
Provides common messaging-based MEF services for loosely coupled Visual Studio extension components communication and integration.
Fabric.DiagnosticEvents 1.0
Fabric Diagnostic Events
GitHub.VisualStudio 2.10.8.8132
A Visual Studio Extension that brings the GitHub Flow into Visual Studio.
IntelliCode Extension 1.0
IntelliCode Visual Studio Extension Detailed Info
Microsoft Azure HDInsight Azure Node 2.4.1000.0
HDInsight Node under Azure Node
Microsoft Azure Hive Query Language Service 2.4.1000.0
Language service for Hive query
Microsoft Azure Service Fabric Tools for Visual Studio 16.0
Microsoft Azure Service Fabric Tools for Visual Studio
Microsoft Azure Stream Analytics Language Service 2.4.1000.0
Language service for Azure Stream Analytics
Microsoft Azure Stream Analytics Node 1.0
Azure Stream Analytics Node under Azure Node
Microsoft Azure Tools 2.9
Microsoft Azure Tools for Microsoft Visual Studio 0x10 - v2.9.20816.1
Microsoft Continuous Delivery Tools for Visual Studio 0.4
Simplifying the configuration of Azure DevOps pipelines from within the Visual Studio IDE.
Microsoft JVM Debugger 1.0
Provides support for connecting the Visual Studio debugger to JDWP compatible Java Virtual Machines
Microsoft Library Manager 2.0.83+gbc8a4b23ec
Install client-side libraries easily to any web project
Microsoft MI-Based Debugger 1.0
Provides support for connecting Visual Studio to MI compatible debuggers
Microsoft Visual C++ Wizards 1.0
Microsoft Visual C++ Wizards
Microsoft Visual Studio Tools for Containers 1.1
Develop, run, validate your ASP.NET Core applications in the target environment. F5 your application directly into a container with debugging, or CTRL + F5 to edit & refresh your app without having to rebuild the container.
Microsoft Visual Studio VC Package 1.0
Microsoft Visual Studio VC Package
Mono Debugging for Visual Studio 16.3.7 (9d260c5)
Support for debugging Mono processes with Visual Studio.
NuGet Package Manager 5.3.1
NuGet Package Manager in Visual Studio. For more information about NuGet, visit https://docs.nuget.org/
ProjectServicesPackage Extension 1.0
ProjectServicesPackage Visual Studio Extension Detailed Info
Snapshot Debugging Extension 1.0
Snapshot Debugging Visual Studio Extension Detailed Info
SQL Server Data Tools 16.0.61908.27190
Microsoft SQL Server Data Tools
ToolWindowHostedEditor 1.0
Hosting json editor into a tool window
TypeScript Tools 16.0.10821.2002
TypeScript Tools for Microsoft Visual Studio
Visual Basic Tools 3.3.1-beta3-19461-02+2fd12c210e22f7d6245805c60340f6a34af6875b
Visual Basic components used in the IDE. Depending on your project type and settings, a different version of the compiler may be used.
Visual F# Tools 10.4 for F# 4.6 16.3.0-beta.19455.1+0422ff293bb2cc722fe5021b85ef50378a9af823
Microsoft Visual F# Tools 10.4 for F# 4.6
Visual Studio Code Debug Adapter Host Package 1.0
Interop layer for hosting Visual Studio Code debug adapters in Visual Studio
Visual Studio Tools for Containers 1.0
Visual Studio Tools for Containers
Visual Studio Tools for Kubernetes 1.0
Visual Studio Tools for Kubernetes
VisualStudio.Mac 1.0
Mac Extension for Visual Studio
WiX Toolset Visual Studio Extension 1.0.0.4
WiX Toolset Visual Studio Extension version 1.0.0.4
Copyright (c) .NET Foundation and contributors. All rights reserved.
Xamarin 16.3.0.281 (d16-3@859f726)
Visual Studio extension to enable development for Xamarin.iOS and Xamarin.Android.
Xamarin Designer 16.3.0.256 (remotes/origin/d16-3@8a223bfd7)
Visual Studio extension to enable Xamarin Designer tools in Visual Studio.
Xamarin Templates 16.3.565 (27e9746)
Templates for building iOS, Android, and Windows apps with Xamarin and Xamarin.Forms.
Xamarin.Android SDK 10.0.6.2 (d16-3/c407838)
Xamarin.Android Reference Assemblies and MSBuild support.
Mono: mono/mono/2019-06@476d72b9e32
Java.Interop: xamarin/java.interop/d16-3@5836f58
LibZipSharp: grendello/LibZipSharp/d16-3@71f4a94
LibZip: nih-at/libzip/rel-1-5-1@b95cf3fd
ProGuard: xamarin/proguard/master@905836d
SQLite: xamarin/sqlite/3.27.1@8212a2d
Xamarin.Android Tools: xamarin/xamarin-android-tools/d16-3@cb41333
Xamarin.iOS and Xamarin.Mac SDK 13.6.0.12 (e3c2b40)
Xamarin.iOS and Xamarin.Mac Reference Assemblies and MSBuild support.
### Log File
<!--
1. On macOS and within Visual Studio:
a. Click **Tools**
[Output-Build.txt](https://github.com/xamarin/xamarin-android/files/3889395/Output-Build.txt)
> **SDK Command Prompt**.
b. Within the launched `Terminal.app` window, run:
adb logcat -d | pbcopy
2. Paste below this comment block
-->
<!--
Switch to the "Preview" tab to ensure your issue renders correctly.
-->
| 1.0 | The "GenerateResourceDesigner" task failed unexpectedly with NullReferenceException - ### Steps to Reproduce
1. Create a Xamarin.Forms project that references a custom .NETStandard library and an SDK extras project
2. Try to build the Xamarin.Forms Android platform project
3. Android build fails with the above error message.
<!--
If you have a repro project, you may drag & drop the .zip/etc. onto the issue editor to attach it.
-->
### Expected Behavior
Build XF Android project without errors
### Actual Behavior
XF Android fails to build with the above error
### Version Information
Reproduced on both latest VS for Windows and VS for Windows Preview (16.3.10 and 16.4.0 Preview 6 respectively).
<!--
1. On macOS and within Visual Studio, select Visual Studio > About Visual Studio, then click the Show Details button, then click the Copy Information button.
2. Paste below this comment block.
-->
Microsoft Visual Studio Enterprise 2019
Version 16.3.10
VisualStudio.16.Release/16.3.10+29519.87
Microsoft .NET Framework
Version 4.8.03752
Installed Version: Enterprise
Visual C++ 2019 00435-60000-00000-AA598
Microsoft Visual C++ 2019
ADL Tools Service Provider 1.0
This package contains services used by Data Lake tools
Application Insights Tools for Visual Studio Package 9.1.00913.1
Application Insights Tools for Visual Studio
ASP.NET and Web Tools 2019 16.3.286.43615
ASP.NET and Web Tools 2019
ASP.NET Web Frameworks and Tools 2019 16.3.286.43615
For additional information, visit https://www.asp.net/
Azure App Service Tools v3.0.0 16.3.286.43615
Azure App Service Tools v3.0.0
Azure Data Lake Node 1.0
This package contains the Data Lake integration nodes for Server Explorer.
Azure Data Lake Tools for Visual Studio 2.4.1000.0
Microsoft Azure Data Lake Tools for Visual Studio
Azure Functions and Web Jobs Tools 16.3.286.43615
Azure Functions and Web Jobs Tools
Azure Stream Analytics Tools for Visual Studio 2.4.1000.0
Microsoft Azure Stream Analytics Tools for Visual Studio
C# Tools 3.3.1-beta3-19461-02+2fd12c210e22f7d6245805c60340f6a34af6875b
C# components used in the IDE. Depending on your project type and settings, a different version of the compiler may be used.
Common Azure Tools 1.10
Provides common services for use by Azure Mobile Services and Microsoft Azure Tools.
Extensibility Message Bus 1.2.0 (d16-2@8b56e20)
Provides common messaging-based MEF services for loosely coupled Visual Studio extension components communication and integration.
Fabric.DiagnosticEvents 1.0
Fabric Diagnostic Events
GitHub.VisualStudio 2.10.8.8132
A Visual Studio Extension that brings the GitHub Flow into Visual Studio.
IntelliCode Extension 1.0
IntelliCode Visual Studio Extension Detailed Info
Microsoft Azure HDInsight Azure Node 2.4.1000.0
HDInsight Node under Azure Node
Microsoft Azure Hive Query Language Service 2.4.1000.0
Language service for Hive query
Microsoft Azure Service Fabric Tools for Visual Studio 16.0
Microsoft Azure Service Fabric Tools for Visual Studio
Microsoft Azure Stream Analytics Language Service 2.4.1000.0
Language service for Azure Stream Analytics
Microsoft Azure Stream Analytics Node 1.0
Azure Stream Analytics Node under Azure Node
Microsoft Azure Tools 2.9
Microsoft Azure Tools for Microsoft Visual Studio 0x10 - v2.9.20816.1
Microsoft Continuous Delivery Tools for Visual Studio 0.4
Simplifying the configuration of Azure DevOps pipelines from within the Visual Studio IDE.
Microsoft JVM Debugger 1.0
Provides support for connecting the Visual Studio debugger to JDWP compatible Java Virtual Machines
Microsoft Library Manager 2.0.83+gbc8a4b23ec
Install client-side libraries easily to any web project
Microsoft MI-Based Debugger 1.0
Provides support for connecting Visual Studio to MI compatible debuggers
Microsoft Visual C++ Wizards 1.0
Microsoft Visual C++ Wizards
Microsoft Visual Studio Tools for Containers 1.1
Develop, run, validate your ASP.NET Core applications in the target environment. F5 your application directly into a container with debugging, or CTRL + F5 to edit & refresh your app without having to rebuild the container.
Microsoft Visual Studio VC Package 1.0
Microsoft Visual Studio VC Package
Mono Debugging for Visual Studio 16.3.7 (9d260c5)
Support for debugging Mono processes with Visual Studio.
NuGet Package Manager 5.3.1
NuGet Package Manager in Visual Studio. For more information about NuGet, visit https://docs.nuget.org/
ProjectServicesPackage Extension 1.0
ProjectServicesPackage Visual Studio Extension Detailed Info
Snapshot Debugging Extension 1.0
Snapshot Debugging Visual Studio Extension Detailed Info
SQL Server Data Tools 16.0.61908.27190
Microsoft SQL Server Data Tools
ToolWindowHostedEditor 1.0
Hosting json editor into a tool window
TypeScript Tools 16.0.10821.2002
TypeScript Tools for Microsoft Visual Studio
Visual Basic Tools 3.3.1-beta3-19461-02+2fd12c210e22f7d6245805c60340f6a34af6875b
Visual Basic components used in the IDE. Depending on your project type and settings, a different version of the compiler may be used.
Visual F# Tools 10.4 for F# 4.6 16.3.0-beta.19455.1+0422ff293bb2cc722fe5021b85ef50378a9af823
Microsoft Visual F# Tools 10.4 for F# 4.6
Visual Studio Code Debug Adapter Host Package 1.0
Interop layer for hosting Visual Studio Code debug adapters in Visual Studio
Visual Studio Tools for Containers 1.0
Visual Studio Tools for Containers
Visual Studio Tools for Kubernetes 1.0
Visual Studio Tools for Kubernetes
VisualStudio.Mac 1.0
Mac Extension for Visual Studio
WiX Toolset Visual Studio Extension 1.0.0.4
WiX Toolset Visual Studio Extension version 1.0.0.4
Copyright (c) .NET Foundation and contributors. All rights reserved.
Xamarin 16.3.0.281 (d16-3@859f726)
Visual Studio extension to enable development for Xamarin.iOS and Xamarin.Android.
Xamarin Designer 16.3.0.256 (remotes/origin/d16-3@8a223bfd7)
Visual Studio extension to enable Xamarin Designer tools in Visual Studio.
Xamarin Templates 16.3.565 (27e9746)
Templates for building iOS, Android, and Windows apps with Xamarin and Xamarin.Forms.
Xamarin.Android SDK 10.0.6.2 (d16-3/c407838)
Xamarin.Android Reference Assemblies and MSBuild support.
Mono: mono/mono/2019-06@476d72b9e32
Java.Interop: xamarin/java.interop/d16-3@5836f58
LibZipSharp: grendello/LibZipSharp/d16-3@71f4a94
LibZip: nih-at/libzip/rel-1-5-1@b95cf3fd
ProGuard: xamarin/proguard/master@905836d
SQLite: xamarin/sqlite/3.27.1@8212a2d
Xamarin.Android Tools: xamarin/xamarin-android-tools/d16-3@cb41333
Xamarin.iOS and Xamarin.Mac SDK 13.6.0.12 (e3c2b40)
Xamarin.iOS and Xamarin.Mac Reference Assemblies and MSBuild support.
### Log File
<!--
1. On macOS and within Visual Studio:
a. Click **Tools** > **SDK Command Prompt**.
b. Within the launched `Terminal.app` window, run:
adb logcat -d | pbcopy
2. Paste below this comment block
-->
[Output-Build.txt](https://github.com/xamarin/xamarin-android/files/3889395/Output-Build.txt)
<!--
Switch to the "Preview" tab to ensure your issue renders correctly.
-->
| non_priority | the generateresourcedesigner task failed unexpectedly with nullreferenceexception steps to reproduce create a xamarin forms project that references a custom netstandard library and an sdk extras project try to build the xamarin forms android platform project android build fails with the above error message if you have a repro project you may drag drop the zip etc onto the issue editor to attach it expected behavior build xf android project without errors actual behavior xf android fails to build with the above error version information reproduced on both latest vs for windows and vs for windows preview and preview respectively on macos and within visual studio select visual studio about visual studio then click the show details button then click the copy information button paste below this comment block microsoft visual studio enterprise version visualstudio release microsoft net framework version installed version enterprise visual c microsoft visual c adl tools service provider this package contains services used by data lake tools application insights tools for visual studio package application insights tools for visual studio asp net and web tools asp net and web tools asp net web frameworks and tools for additional information visit azure app service tools azure app service tools azure data lake node this package contains the data lake integration nodes for server explorer azure data lake tools for visual studio microsoft azure data lake tools for visual studio azure functions and web jobs tools azure functions and web jobs tools azure stream analytics tools for visual studio microsoft azure stream analytics tools for visual studio c tools c components used in the ide depending on your project type and settings a different version of the compiler may be used common azure tools provides common services for use by azure mobile services and microsoft azure tools extensibility message bus provides common messaging based mef services for loosely 
coupled visual studio extension components communication and integration fabric diagnosticevents fabric diagnostic events github visualstudio a visual studio extension that brings the github flow into visual studio intellicode extension intellicode visual studio extension detailed info microsoft azure hdinsight azure node hdinsight node under azure node microsoft azure hive query language service language service for hive query microsoft azure service fabric tools for visual studio microsoft azure service fabric tools for visual studio microsoft azure stream analytics language service language service for azure stream analytics microsoft azure stream analytics node azure stream analytics node under azure node microsoft azure tools microsoft azure tools for microsoft visual studio microsoft continuous delivery tools for visual studio simplifying the configuration of azure devops pipelines from within the visual studio ide microsoft jvm debugger provides support for connecting the visual studio debugger to jdwp compatible java virtual machines microsoft library manager install client side libraries easily to any web project microsoft mi based debugger provides support for connecting visual studio to mi compatible debuggers microsoft visual c wizards microsoft visual c wizards microsoft visual studio tools for containers develop run validate your asp net core applications in the target environment your application directly into a container with debugging or ctrl to edit refresh your app without having to rebuild the container microsoft visual studio vc package microsoft visual studio vc package mono debugging for visual studio support for debugging mono processes with visual studio nuget package manager nuget package manager in visual studio for more information about nuget visit projectservicespackage extension projectservicespackage visual studio extension detailed info snapshot debugging extension snapshot debugging visual studio extension detailed info sql server 
data tools microsoft sql server data tools toolwindowhostededitor hosting json editor into a tool window typescript tools typescript tools for microsoft visual studio visual basic tools visual basic components used in the ide depending on your project type and settings a different version of the compiler may be used visual f tools for f beta microsoft visual f tools for f visual studio code debug adapter host package interop layer for hosting visual studio code debug adapters in visual studio visual studio tools for containers visual studio tools for containers visual studio tools for kubernetes visual studio tools for kubernetes visualstudio mac mac extension for visual studio wix toolset visual studio extension wix toolset visual studio extension version copyright c net foundation and contributors all rights reserved xamarin visual studio extension to enable development for xamarin ios and xamarin android xamarin designer remotes origin visual studio extension to enable xamarin designer tools in visual studio xamarin templates templates for building ios android and windows apps with xamarin and xamarin forms xamarin android sdk xamarin android reference assemblies and msbuild support mono mono mono java interop xamarin java interop libzipsharp grendello libzipsharp libzip nih at libzip rel proguard xamarin proguard master sqlite xamarin sqlite xamarin android tools xamarin xamarin android tools xamarin ios and xamarin mac sdk xamarin ios and xamarin mac reference assemblies and msbuild support log file on macos and within visual studio a click tools sdk command prompt b within the launched terminal app window run adb logcat d pbcopy paste below this comment block switch to the preview tab to ensure your issue renders correctly | 0 |
613,913 | 19,101,534,315 | IssuesEvent | 2021-11-29 23:21:34 | CMPUT301F21T21/detes | https://api.github.com/repos/CMPUT301F21T21/detes | closed | US 06.01.02 Geolocation and Maps | Final checkpoint High Risk Medium Priority New | As a doer, I want the location for a habit event to be specified using a map within the app, with the current phone position as the default location.
**Clarification:** The user would like the ability to specify a location for a habit by using the app once the habit has been completed. This will all be accomplished in the app.
Story Points: 5 | 1.0 | US 06.01.02 Geolocation and Maps - As a doer, I want the location for a habit event to be specified using a map within the app, with the current phone position as the default location.
**Clarification:** The user would like the ability to specify a location for a habit by using the app once the habit has been completed. This will all be accomplished in the app.
Story Points: 5 | priority | us geolocation and maps as a doer i want the location for a habit event to be specified using a map within the app with the current phone position as the default location clarification the user would like the ability to specify a location of habit by using the app once the habit has been completed this will all be accomplished in the app story points | 1 |
135,879 | 5,266,289,552 | IssuesEvent | 2017-02-04 10:46:23 | japanesemediamanager/ShokoServer | https://api.github.com/repos/japanesemediamanager/ShokoServer | closed | multiple db entries linking to same file | Bug - Low Priority Most Likely Fixed - Need Confirmation | I have file renaming, but it's failing sometimes.
1. Is there a way to avoid creating multiple database entries that link to the same file? Check before overwriting a file?
2. Where does the CRC come from? AniDB? Can it be calculated locally when missing?
JMM Desktop and Server 3.6.1.0

| 1.0 | multiple db entries linking to same file - I have file renaming, but it's failing sometimes.
1. Is there a way to avoid creating multiple database entries that link to the same file? Check before overwriting a file?
2. Where does the CRC come from? AniDB? Can it be calculated locally when missing?
JMM Desktop and Server 3.6.1.0

| priority | multiple db entries linking to same file i have file renaming but it s failing sometimes is there a way to avoid creating multiple database entries which link to same file check before overwriting a file where does crc come from anidb can it be calculated locally when missing jmm desktop and server | 1 |
75,243 | 7,466,758,982 | IssuesEvent | 2018-04-02 12:27:58 | wso2/testgrid | https://api.github.com/repos/wso2/testgrid | opened | [Dashboard] Create an admin portal to add new infrastructure parameters into the system. | Priority/Normal Severity/Minor Testgrid/Dashboard Type/New Feature | **Description:**
This is a tentative task.
It would be good to have an admin portal to add/remove/update the infrastructure_parameter table. Right now, we have to populate the database manually, which is a bit cumbersome.
<!-- Give a brief description of the issue -->
**Affected Product Version:**
0.9.0-m19
**OS, DB, other environment details and versions:**
**Steps to reproduce:**
N/A
**Related Issues:**
#283
<!-- Any related issues such as sub tasks, issues reported in other repositories (e.g component repositories), similar problems, etc. --> | 1.0 | [Dashboard] Create an admin portal to add new infrastructure parameters into the system. - **Description:**
This is a tentative task.
It would be good to have an admin portal to add/remove/update the infrastructure_parameter table. Right now, we have to populate the database manually, which is a bit cumbersome.
<!-- Give a brief description of the issue -->
**Affected Product Version:**
0.9.0-m19
**OS, DB, other environment details and versions:**
**Steps to reproduce:**
N/A
**Related Issues:**
#283
<!-- Any related issues such as sub tasks, issues reported in other repositories (e.g component repositories), similar problems, etc. --> | non_priority | create an admin portal to add new infrastructure parameters into the system description this is a tentative task it s good have a admin portal to add remove update infrastructure parameter table right now we have to manually go and populate the database it s a bit cumbersome affected product version os db other environment details and versions steps to reproduce n a related issues | 0 |
758,219 | 26,546,725,261 | IssuesEvent | 2023-01-20 01:18:49 | zulip/zulip | https://api.github.com/repos/zulip/zulip | opened | Improve "Complete the organization profile" banner | help wanted area: onboarding priority: high | At present, the owner of a new Zulip organization is immediately shown a banner at the top of the app that says:
> Complete the organization profile to brand and explain the purpose of this Zulip organization.
As [discussed on CZO](https://chat.zulip.org/#narrow/stream/101-design/topic/organization.20profile.20banner/near/1489812), we should:
- [ ] Change the banner text to say the following, which better explains what the profile is for:
> Complete [your organization's profile](#organization/organization-profile), which is displayed on your organization's registration and login pages.
- [ ] Delay showing the banner until 15 days after the organization is created.
| 1.0 | Improve "Complete the organization profile" banner - At present, the owner of a new Zulip organization is immediately shown a banner at the top of the app that says:
> Complete the organization profile to brand and explain the purpose of this Zulip organization.
As [discussed on CZO](https://chat.zulip.org/#narrow/stream/101-design/topic/organization.20profile.20banner/near/1489812), we should:
- [ ] Change the banner text to say the following, which better explains what the profile is for:
> Complete [your organization's profile](#organization/organization-profile), which is displayed on your organization's registration and login pages.
- [ ] Delay showing the banner until 15 days after the organization is created.
| priority | improve complete the organization profile banner at present the owner of a new zulip organization is immediately shown a banner at the top of the app that says complete the organization profile to brand and explain the purpose of this zulip organization as we should change the banner text to say the following which better explains what the profile is for complete organization organization profile which is displayed on your organization s registration and login pages delay showing the banner until days after the organization is created | 1 |
42,318 | 9,203,106,565 | IssuesEvent | 2019-03-08 00:58:38 | Microsoft/vscode-python | https://api.github.com/repos/Microsoft/vscode-python | opened | Refactor tests in pytest.run.test.ts | P1 feature-testing type-code health | Currently the tests are just not manageable, nor readable.
`src/test/unittests/pytest/pytest.run.test.ts` | 1.0 | Refactor tests in pytest.run.test.ts - Currently the tests are just not manageable, nor readable.
`src/test/unittests/pytest/pytest.run.test.ts` | non_priority | refactor tests in pytest run test ts currently the tests are just not manageable nor readable src test unittests pytest pytest run test ts | 0 |
121,188 | 17,646,041,248 | IssuesEvent | 2021-08-20 06:15:08 | MohamedElashri/Zotero-Docker | https://api.github.com/repos/MohamedElashri/Zotero-Docker | opened | CVE-2021-23413 (High) detected in jszip-2.4.0.tgz | security vulnerability | ## CVE-2021-23413 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>jszip-2.4.0.tgz</b></p></summary>
<p>Create, read and edit .zip files with Javascript http://stuartk.com/jszip</p>
<p>Library home page: <a href="https://registry.npmjs.org/jszip/-/jszip-2.4.0.tgz">https://registry.npmjs.org/jszip/-/jszip-2.4.0.tgz</a></p>
<p>Path to dependency file: Zotero-Docker/client/zotero-build/xpi/package.json</p>
<p>Path to vulnerable library: Zotero-Docker/client/zotero-build/xpi/node_modules/jszip/package.json</p>
<p>
Dependency Hierarchy:
- jpm-1.2.2.tgz (Root Library)
- :x: **jszip-2.4.0.tgz** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/MohamedElashri/Zotero-Docker/commit/2f0b543b4f60571cb367b6f3ef1b05298e349c06">2f0b543b4f60571cb367b6f3ef1b05298e349c06</a></p>
<p>Found in base branch: <b>main</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
This affects the package jszip before 3.7.0. Crafting a new zip file with filenames set to Object prototype values (e.g __proto__, toString, etc) results in a returned object with a modified prototype instance.
<p>Publish Date: 2021-07-25
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-23413>CVE-2021-23413</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.5</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2021-23413">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2021-23413</a></p>
<p>Release Date: 2021-07-25</p>
<p>Fix Resolution: jszip - 3.7.0</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github) | True | CVE-2021-23413 (High) detected in jszip-2.4.0.tgz - ## CVE-2021-23413 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>jszip-2.4.0.tgz</b></p></summary>
<p>Create, read and edit .zip files with Javascript http://stuartk.com/jszip</p>
<p>Library home page: <a href="https://registry.npmjs.org/jszip/-/jszip-2.4.0.tgz">https://registry.npmjs.org/jszip/-/jszip-2.4.0.tgz</a></p>
<p>Path to dependency file: Zotero-Docker/client/zotero-build/xpi/package.json</p>
<p>Path to vulnerable library: Zotero-Docker/client/zotero-build/xpi/node_modules/jszip/package.json</p>
<p>
Dependency Hierarchy:
- jpm-1.2.2.tgz (Root Library)
- :x: **jszip-2.4.0.tgz** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/MohamedElashri/Zotero-Docker/commit/2f0b543b4f60571cb367b6f3ef1b05298e349c06">2f0b543b4f60571cb367b6f3ef1b05298e349c06</a></p>
<p>Found in base branch: <b>main</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
This affects the package jszip before 3.7.0. Crafting a new zip file with filenames set to Object prototype values (e.g __proto__, toString, etc) results in a returned object with a modified prototype instance.
<p>Publish Date: 2021-07-25
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-23413>CVE-2021-23413</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.5</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2021-23413">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2021-23413</a></p>
<p>Release Date: 2021-07-25</p>
<p>Fix Resolution: jszip - 3.7.0</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github) | non_priority | cve high detected in jszip tgz cve high severity vulnerability vulnerable library jszip tgz create read and edit zip files with javascript library home page a href path to dependency file zotero docker client zotero build xpi package json path to vulnerable library zotero docker client zotero build xpi node modules jszip package json dependency hierarchy jpm tgz root library x jszip tgz vulnerable library found in head commit a href found in base branch main vulnerability details this affects the package jszip before crafting a new zip file with filenames set to object prototype values e g proto tostring etc results in a returned object with a modified prototype instance publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction none scope unchanged impact metrics confidentiality impact none integrity impact none availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution jszip step up your open source security game with whitesource | 0 |
186,948 | 14,426,868,438 | IssuesEvent | 2020-12-06 00:28:37 | kalexmills/github-vet-tests-dec2020 | https://api.github.com/repos/kalexmills/github-vet-tests-dec2020 | closed | zhengqiangzheng/vscode_code: go/pkg/mod/golang.org/x/tools@v0.0.0-20181130195746-895048a75ecf/go/internal/gccgoimporter/gccgoinstallation_test.go; 10 LoC | fresh test tiny |
Found a possible issue in [zhengqiangzheng/vscode_code](https://www.github.com/zhengqiangzheng/vscode_code) at [go/pkg/mod/golang.org/x/tools@v0.0.0-20181130195746-895048a75ecf/go/internal/gccgoimporter/gccgoinstallation_test.go](https://github.com/zhengqiangzheng/vscode_code/blob/d60b6126f7e6112ee54647b3aee81690b0c0f985/go/pkg/mod/golang.org/x/tools@v0.0.0-20181130195746-895048a75ecf/go/internal/gccgoimporter/gccgoinstallation_test.go#L181-L190)
Below is the message reported by the analyzer for this snippet of code. Beware that the analyzer only reports the first
issue it finds, so please do not limit your consideration to the contents of the below message.
> function call which takes a reference to test at line 189 may start a goroutine
[Click here to see the code in its original context.](https://github.com/zhengqiangzheng/vscode_code/blob/d60b6126f7e6112ee54647b3aee81690b0c0f985/go/pkg/mod/golang.org/x/tools@v0.0.0-20181130195746-895048a75ecf/go/internal/gccgoimporter/gccgoinstallation_test.go#L181-L190)
<details>
<summary>Click here to show the 10 line(s) of Go which triggered the analyzer.</summary>
```go
for _, test := range [...]importerTest{
{pkgpath: "io", name: "Reader", want: "type Reader interface{Read(p []uint8) (n int, err error)}"},
{pkgpath: "io", name: "ReadWriter", want: "type ReadWriter interface{Reader; Writer}"},
{pkgpath: "math", name: "Pi", want: "const Pi untyped float"},
{pkgpath: "math", name: "Sin", want: "func Sin(x float64) float64"},
{pkgpath: "sort", name: "Ints", want: "func Ints(a []int)"},
{pkgpath: "unsafe", name: "Pointer", want: "type Pointer unsafe.Pointer"},
} {
runImporterTest(t, imp, nil, &test)
}
```
</details>
Leave a reaction on this issue to contribute to the project by classifying this instance as a **Bug** :-1:, **Mitigated** :+1:, or **Desirable Behavior** :rocket:
See the descriptions of the classifications [here](https://github.com/github-vet/rangeclosure-findings#how-can-i-help) for more information.
commit ID: d60b6126f7e6112ee54647b3aee81690b0c0f985
| 1.0 | zhengqiangzheng/vscode_code: go/pkg/mod/golang.org/x/tools@v0.0.0-20181130195746-895048a75ecf/go/internal/gccgoimporter/gccgoinstallation_test.go; 10 LoC -
Found a possible issue in [zhengqiangzheng/vscode_code](https://www.github.com/zhengqiangzheng/vscode_code) at [go/pkg/mod/golang.org/x/tools@v0.0.0-20181130195746-895048a75ecf/go/internal/gccgoimporter/gccgoinstallation_test.go](https://github.com/zhengqiangzheng/vscode_code/blob/d60b6126f7e6112ee54647b3aee81690b0c0f985/go/pkg/mod/golang.org/x/tools@v0.0.0-20181130195746-895048a75ecf/go/internal/gccgoimporter/gccgoinstallation_test.go#L181-L190)
Below is the message reported by the analyzer for this snippet of code. Beware that the analyzer only reports the first
issue it finds, so please do not limit your consideration to the contents of the below message.
> function call which takes a reference to test at line 189 may start a goroutine
[Click here to see the code in its original context.](https://github.com/zhengqiangzheng/vscode_code/blob/d60b6126f7e6112ee54647b3aee81690b0c0f985/go/pkg/mod/golang.org/x/tools@v0.0.0-20181130195746-895048a75ecf/go/internal/gccgoimporter/gccgoinstallation_test.go#L181-L190)
<details>
<summary>Click here to show the 10 line(s) of Go which triggered the analyzer.</summary>
```go
for _, test := range [...]importerTest{
{pkgpath: "io", name: "Reader", want: "type Reader interface{Read(p []uint8) (n int, err error)}"},
{pkgpath: "io", name: "ReadWriter", want: "type ReadWriter interface{Reader; Writer}"},
{pkgpath: "math", name: "Pi", want: "const Pi untyped float"},
{pkgpath: "math", name: "Sin", want: "func Sin(x float64) float64"},
{pkgpath: "sort", name: "Ints", want: "func Ints(a []int)"},
{pkgpath: "unsafe", name: "Pointer", want: "type Pointer unsafe.Pointer"},
} {
runImporterTest(t, imp, nil, &test)
}
```
</details>
Leave a reaction on this issue to contribute to the project by classifying this instance as a **Bug** :-1:, **Mitigated** :+1:, or **Desirable Behavior** :rocket:
See the descriptions of the classifications [here](https://github.com/github-vet/rangeclosure-findings#how-can-i-help) for more information.
commit ID: d60b6126f7e6112ee54647b3aee81690b0c0f985
| non_priority | zhengqiangzheng vscode code go pkg mod golang org x tools go internal gccgoimporter gccgoinstallation test go loc found a possible issue in at below is the message reported by the analyzer for this snippet of code beware that the analyzer only reports the first issue it finds so please do not limit your consideration to the contents of the below message function call which takes a reference to test at line may start a goroutine click here to show the line s of go which triggered the analyzer go for test range importertest pkgpath io name reader want type reader interface read p n int err error pkgpath io name readwriter want type readwriter interface reader writer pkgpath math name pi want const pi untyped float pkgpath math name sin want func sin x pkgpath sort name ints want func ints a int pkgpath unsafe name pointer want type pointer unsafe pointer runimportertest t imp nil test leave a reaction on this issue to contribute to the project by classifying this instance as a bug mitigated or desirable behavior rocket see the descriptions of the classifications for more information commit id | 0 |
357,294 | 25,176,361,479 | IssuesEvent | 2022-11-11 09:36:49 | LokQiJun/pe | https://api.github.com/repos/LokQiJun/pe | opened | Wrong formatting of variables on page 13 of DG | type.DocumentationBug severity.VeryLow | Format should be variable_name:variable_type

<!--session: 1668153579868-cea4df5c-4c6d-4f1d-ae65-e376ecbc9cc3-->
<!--Version: Web v3.4.4--> | 1.0 | Wrong formatting of variables on page 13 of DG - Format should be variable_name:variable_type

<!--session: 1668153579868-cea4df5c-4c6d-4f1d-ae65-e376ecbc9cc3-->
<!--Version: Web v3.4.4--> | non_priority | wrong formatting of variables on page of dg format should be variable name variable type | 0 |
157,013 | 19,911,268,247 | IssuesEvent | 2022-01-25 17:23:36 | ghc-dev/Michael-Jones | https://api.github.com/repos/ghc-dev/Michael-Jones | closed | CVE-2017-16138 (High) detected in mime-1.3.4.tgz - autoclosed | security vulnerability | ## CVE-2017-16138 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>mime-1.3.4.tgz</b></p></summary>
<p>A comprehensive library for mime-type mapping</p>
<p>Library home page: <a href="https://registry.npmjs.org/mime/-/mime-1.3.4.tgz">https://registry.npmjs.org/mime/-/mime-1.3.4.tgz</a></p>
<p>Path to dependency file: /package.json</p>
<p>Path to vulnerable library: /node_modules/mime/package.json</p>
<p>
Dependency Hierarchy:
- :x: **mime-1.3.4.tgz** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/ghc-dev/Michael-Jones/commit/d10019c82a0425af26828897a076bc3d939d6ee4">d10019c82a0425af26828897a076bc3d939d6ee4</a></p>
<p>Found in base branch: <b>main</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
The mime module < 1.4.1, 2.0.1, 2.0.2 is vulnerable to regular expression denial of service when a mime lookup is performed on untrusted user input.
<p>Publish Date: 2018-06-07
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2017-16138>CVE-2017-16138</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.5</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2017-16138">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2017-16138</a></p>
<p>Release Date: 2018-06-07</p>
<p>Fix Resolution: 1.4.1</p>
</p>
</details>
<p></p>
***
<!-- REMEDIATE-OPEN-PR-START -->
- [ ] Check this box to open an automated fix PR
<!-- REMEDIATE-OPEN-PR-END -->
<!-- <REMEDIATE>{"isOpenPROnVulnerability":false,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"javascript/Node.js","packageName":"mime","packageVersion":"1.3.4","packageFilePaths":["/package.json"],"isTransitiveDependency":false,"dependencyTree":"mime:1.3.4","isMinimumFixVersionAvailable":true,"minimumFixVersion":"1.4.1","isBinary":false}],"baseBranches":["main"],"vulnerabilityIdentifier":"CVE-2017-16138","vulnerabilityDetails":"The mime module \u003c 1.4.1, 2.0.1, 2.0.2 is vulnerable to regular expression denial of service when a mime lookup is performed on untrusted user input.","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2017-16138","cvss3Severity":"high","cvss3Score":"7.5","cvss3Metrics":{"A":"High","AC":"Low","PR":"None","S":"Unchanged","C":"None","UI":"None","AV":"Network","I":"None"},"extraData":{}}</REMEDIATE> --> | True | CVE-2017-16138 (High) detected in mime-1.3.4.tgz - autoclosed - ## CVE-2017-16138 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>mime-1.3.4.tgz</b></p></summary>
<p>A comprehensive library for mime-type mapping</p>
<p>Library home page: <a href="https://registry.npmjs.org/mime/-/mime-1.3.4.tgz">https://registry.npmjs.org/mime/-/mime-1.3.4.tgz</a></p>
<p>Path to dependency file: /package.json</p>
<p>Path to vulnerable library: /node_modules/mime/package.json</p>
<p>
Dependency Hierarchy:
- :x: **mime-1.3.4.tgz** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/ghc-dev/Michael-Jones/commit/d10019c82a0425af26828897a076bc3d939d6ee4">d10019c82a0425af26828897a076bc3d939d6ee4</a></p>
<p>Found in base branch: <b>main</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
The mime module < 1.4.1, 2.0.1, 2.0.2 is vulnerable to regular expression denial of service when a mime lookup is performed on untrusted user input.
<p>Publish Date: 2018-06-07
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2017-16138>CVE-2017-16138</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.5</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2017-16138">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2017-16138</a></p>
<p>Release Date: 2018-06-07</p>
<p>Fix Resolution: 1.4.1</p>
</p>
</details>
<p></p>
***
<!-- REMEDIATE-OPEN-PR-START -->
- [ ] Check this box to open an automated fix PR
<!-- REMEDIATE-OPEN-PR-END -->
<!-- <REMEDIATE>{"isOpenPROnVulnerability":false,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"javascript/Node.js","packageName":"mime","packageVersion":"1.3.4","packageFilePaths":["/package.json"],"isTransitiveDependency":false,"dependencyTree":"mime:1.3.4","isMinimumFixVersionAvailable":true,"minimumFixVersion":"1.4.1","isBinary":false}],"baseBranches":["main"],"vulnerabilityIdentifier":"CVE-2017-16138","vulnerabilityDetails":"The mime module \u003c 1.4.1, 2.0.1, 2.0.2 is vulnerable to regular expression denial of service when a mime lookup is performed on untrusted user input.","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2017-16138","cvss3Severity":"high","cvss3Score":"7.5","cvss3Metrics":{"A":"High","AC":"Low","PR":"None","S":"Unchanged","C":"None","UI":"None","AV":"Network","I":"None"},"extraData":{}}</REMEDIATE> --> | non_priority | cve high detected in mime tgz autoclosed cve high severity vulnerability vulnerable library mime tgz a comprehensive library for mime type mapping library home page a href path to dependency file package json path to vulnerable library node modules mime package json dependency hierarchy x mime tgz vulnerable library found in head commit a href found in base branch main vulnerability details the mime module is vulnerable to regular expression denial of service when a mime lookup is performed on untrusted user input publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction none scope unchanged impact metrics confidentiality impact none integrity impact none availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution check this box to open an automated fix pr isopenpronvulnerability false ispackagebased true isdefaultbranch true packages istransitivedependency false dependencytree 
mime isminimumfixversionavailable true minimumfixversion isbinary false basebranches vulnerabilityidentifier cve vulnerabilitydetails the mime module is vulnerable to regular expression denial of service when a mime lookup is performed on untrusted user input vulnerabilityurl | 0 |
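Records like the one above embed a machine-readable `<REMEDIATE>` JSON blob alongside the prose, and the upgrade advice can be extracted from it programmatically. A minimal sketch, using an abridged payload with field names taken from the record above (the `upgrade_advice` helper is made up):

```python
import json

# Abridged REMEDIATE payload as embedded in the issue body above.
payload = json.loads("""
{"packages": [{"packageName": "mime",
               "packageVersion": "1.3.4",
               "isMinimumFixVersionAvailable": true,
               "minimumFixVersion": "1.4.1"}],
 "vulnerabilityIdentifier": "CVE-2017-16138",
 "cvss3Score": "7.5"}
""")

def upgrade_advice(p):
    """Yield one human-readable remediation line per affected package."""
    for pkg in p["packages"]:
        if pkg.get("isMinimumFixVersionAvailable"):
            yield (f'{pkg["packageName"]}: upgrade '
                   f'{pkg["packageVersion"]} -> {pkg["minimumFixVersion"]}')
        else:
            yield f'{pkg["packageName"]}: no published fix yet'

print(list(upgrade_advice(payload)))  # → ['mime: upgrade 1.3.4 -> 1.4.1']
```

This is how the "Fix Resolution: 1.4.1" line in the human-readable section relates to the structured data: both come from the same `minimumFixVersion` field.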
608,708 | 18,846,601,704 | IssuesEvent | 2021-11-11 15:35:40 | metabase/metabase | https://api.github.com/repos/metabase/metabase | closed | Dashboard filters overflow the screen making it not possible to use them on IE11 | Type:Bug .Won't Fix Priority:P2 Reporting/Dashboards Browser:IE Querying/Parameters & Variables | **Describe the bug**
Dashboard filters overflow the screen making it not possible to use them on IE11 - there's no horizontal scrollbar available.
Temporary workaround: Use the page zoom to zoom out (`CTRL`+`-`), set the filters and then reset the zoom level (`CTRL`+`0`)
Recommendation: Use the new Edge browser, which has something called IE Mode to support legacy web applications:
https://docs.microsoft.com/en-us/deployedge/edge-ie-mode
That would mean that Metabase could be used with Edge (modern rendering), while legacy intranet web applications that only work on IE could also be used in the same browser.
----
Seems like removing `align-start` from this line (which is the only place it's used) will make it work correctly in IE11 - not sure if there are any negative side-effects, but looks normal in Firefox:
https://github.com/metabase/metabase/blob/f4d90f688ae6b1d01f556b3f76edbec174ce5c2b/frontend/src/metabase/dashboard/components/Dashboard.jsx#L316
----
**To Reproduce**
1. Create a dashboard and add multiple filters until there are more than screen width
2. On IE11 it looks like this (it also squishes the sidebar, when there is a lot of filter widgets):

And depending on the length of the filter name, this can mean that only a few filters are available for usage:

**Expected behavior**
As close to modern browsers as possible - this is Firefox:

**Information about your Metabase Installation:**
Tested 0.37.0 thru 0.38.2
**Additional context**
Using virtual machine IE11 image via https://developer.microsoft.com/en-us/microsoft-edge/tools/vms/ | 1.0 | Dashboard filters overflow the screen making it not possible to use them on IE11 - **Describe the bug**
Dashboard filters overflow the screen making it not possible to use them on IE11 - there's no horizontal scrollbar available.
Temporary workaround: Use the page zoom to zoom out (`CTRL`+`-`), set the filters and then reset the zoom level (`CTRL`+`0`)
Recommendation: Use the new Edge browser, which has something called IE Mode to support legacy web applications:
https://docs.microsoft.com/en-us/deployedge/edge-ie-mode
That would mean that Metabase could be used with Edge (modern rendering), while legacy intranet web applications that only work on IE could also be used in the same browser.
----
Seems like removing `align-start` from this line (which is the only place it's used) will make it work correctly in IE11 - not sure if there are any negative side-effects, but looks normal in Firefox:
https://github.com/metabase/metabase/blob/f4d90f688ae6b1d01f556b3f76edbec174ce5c2b/frontend/src/metabase/dashboard/components/Dashboard.jsx#L316
----
**To Reproduce**
1. Create a dashboard and add multiple filters until there are more than screen width
2. On IE11 it looks like this (it also squishes the sidebar, when there is a lot of filter widgets):

And depending on the length of the filter name, this can mean that only a few filters are available for usage:

**Expected behavior**
As close to modern browsers as possible - this is Firefox:

**Information about your Metabase Installation:**
Tested 0.37.0 thru 0.38.2
**Additional context**
Using virtual machine IE11 image via https://developer.microsoft.com/en-us/microsoft-edge/tools/vms/ | priority | dashboard filters overflow the screen making it not possible to use them on describe the bug dashboard filters overflow the screen making it not possible to use them on there s no horizontal scrollbar available temporary workaround use the page zoom to zoom out ctrl set the filters and then reset the zoom level ctrl recommendation use the new edge browser which has something called ie mode to support legacy web application that would mean that metabase could be used with the edge modern rendering while legacy intranet web applications that only works on ie could also be used in the same browser seems like removing align start from this line which is the only place it s used will make it work correctly in not sure if there are any negative side effects but looks normal in firefox to reproduce create a dashboard and add multiple filters until there are more than screen width on it looks like this it also squishes the sidebar when there is a lot of filter widgets and depending on the length of the filter name this can mean that only a few filters are available for usage expected behavior as close to modern browsers as possible this is firefox information about your metabase installation tested thru additional context using virtual machine image via | 1 |
11,244 | 9,291,965,633 | IssuesEvent | 2019-03-22 00:46:49 | Azure/azure-sdk-for-js | https://api.github.com/repos/Azure/azure-sdk-for-js | opened | [Service Bus] detached() on clients should not be a public method | Client Service Bus | _From API review notes in https://github.com/Azure/azure-sdk-for-js/issues/1481_
The `detached()` method on the QueueClient/TopicClient/SubscriptionClient is currently a public method.
This function is internally used when the library tries to recover from network issues.
All it does is call `detached()` on each of the senders and receivers on the client.
Since it is meant only for internal use, it should not be a public method.
Proposed solution:
Get rid of the `ClientEntityContext`. All this does is keep track of all senders/receivers for the client. Move this tracking up to the `ConnectionContext` on the `Namespace` class. This way, we don't need a `detached()` on the clients anymore.
When we need to recover senders/receivers, the `ConnectionContext` can now safely call `detached` on the individual senders/receivers | 1.0 | [Service Bus] detached() on clients should not be a public method - _From API review notes in https://github.com/Azure/azure-sdk-for-js/issues/1481_
The `detached()` method on the QueueClient/TopicClient/SubscriptionClient is currently a public method.
This function is internally used when the library tries to recover from network issues.
All it does is call `detached()` on each of the senders and receivers on the client.
Since it is meant only for internal use, it should not be a public method.
Proposed solution:
Get rid of the `ClientEntityContext`. All this does is keep track of all senders/receivers for the client. Move this tracking up to the `ConnectionContext` on the `Namespace` class. This way, we don't need a `detached()` on the clients anymore.
When we need to recover senders/receivers, the `ConnectionContext` can now safely call `detached` on the individual senders/receivers | non_priority | detached on clients should not be a public method from api review notes in the detached method on the queueclient topicclient subscriptionclient is currently a public method this function is internally used when the library tries to recover from network issues all it does is call detached on each of the senders and receivers on the client since it is meant only for internal use it should not be a public method proposed solution get rid of the cliententitycontext all this does is keep track of all senders receivers for the client move this tracking up to the connectioncontext on the namespace class this way we don t need a detached on the clients anymore when we need to recover senders receivers the connectioncontext can now safely call detached on the individual senders receivers | 0 |
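The refactor proposed in the record above — tracking every sender/receiver at the connection level so that `detached()` never needs to be public on the clients — can be sketched as follows. This is an illustrative sketch only; the class and method names here are made up and are not the actual Azure SDK API.

```python
class ConnectionContext:
    """Connection-level registry of all senders/receivers (sketch of the
    proposal: tracking moves up from the per-client ClientEntityContext)."""

    def __init__(self):
        self._links = []

    def register(self, link):
        # Called when a sender/receiver is created on any client.
        self._links.append(link)

    def _recover(self):
        # Internal-only recovery path: no public detached() on clients needed,
        # the connection context reaches each link directly.
        for link in self._links:
            link.detached()


class Link:
    """Stand-in for a sender or receiver."""

    def __init__(self):
        self.detach_count = 0

    def detached(self):
        self.detach_count += 1


ctx = ConnectionContext()
a, b = Link(), Link()
ctx.register(a)
ctx.register(b)
ctx._recover()  # simulate recovery after a network drop
print(a.detach_count, b.detach_count)  # → 1 1
```

Because `_recover` is internal, library users never see a public `detached()`, which is the point of the proposal.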
72,385 | 19,193,334,761 | IssuesEvent | 2021-12-06 05:22:24 | tpaviot/pythonocc-core | https://api.github.com/repos/tpaviot/pythonocc-core | closed | LNK2019: unresolved external symbol | compilation/build system | Hi, I have been trying to build the pythonocc from source code.
The platform is Win10, the compiler is MSVC from VS2019 Community, and I have tried to build both the pythonocc 7.5.1 and 7.5.0rc1 versions, but both failed.
First I successfully built OpenCASCADE 7.5.1 and OpenCASCADE 7.5.0, and then I followed the instructions in install.md to build pythonocc 7.5.1 and pythonocc 7.5.0rc1; the SWIG version is 3.0.11. The errors are the same: after building All_Build in VS2019 there are many LNK2019 errors, one of which is pasted here:
Severity Code Description Project File Line Suppression State
Error LNK2019 unresolved external symbol "public: __cdecl TDF_Transaction::TDF_Transaction(class TCollection_AsciiString const &)" (??0TDF_Transaction@@QEAA@AEBVTCollection_AsciiString@@@Z) referenced in function "public: void __cdecl TDF_Transaction::`default constructor closure'(void)" (??_FTDF_Transaction@@QEAAXXZ) _IGESCAFControl C:\Googol_Projects\FREECAD0.2\PYTHONOCC751\pythonocc-core-751-build\IGESCAFControlPYTHON_wrap.obj 1
I have 287 projects in pythonocc successfully built but 17 failed. There are many LNK2019 errors. How should I resolve this issue? Thank you very much! | 1.0 | LNK2019: unresolved external symbol - Hi, I have been trying to build the pythonocc from source code.
The platform is Win10, the compiler is MSVC from VS2019 Community, and I have tried to build both the pythonocc 7.5.1 and 7.5.0rc1 versions, but both failed.
First I successfully built OpenCASCADE 7.5.1 and OpenCASCADE 7.5.0, and then I followed the instructions in install.md to build pythonocc 7.5.1 and pythonocc 7.5.0rc1; the SWIG version is 3.0.11. The errors are the same: after building All_Build in VS2019 there are many LNK2019 errors, one of which is pasted here:
Severity Code Description Project File Line Suppression State
Error LNK2019 unresolved external symbol "public: __cdecl TDF_Transaction::TDF_Transaction(class TCollection_AsciiString const &)" (??0TDF_Transaction@@QEAA@AEBVTCollection_AsciiString@@@Z) referenced in function "public: void __cdecl TDF_Transaction::`default constructor closure'(void)" (??_FTDF_Transaction@@QEAAXXZ) _IGESCAFControl C:\Googol_Projects\FREECAD0.2\PYTHONOCC751\pythonocc-core-751-build\IGESCAFControlPYTHON_wrap.obj 1
I have 287 projects in pythonocc successfully built but 17 failed. There are many LNK2019 errors. How should I resolve this issue? Thank you very much! | non_priority | unresolved external symbol hi i have been trying to build the pythonocc from source code the platform is compiler is msvc from community and i have tried to build both the pythonocc version and version but they all failed first i successfully build the opencscade and opencascade and then i followed the instruction in install md to build pythonocc and pythonocc the swig version is the erros are same after building all build in there are many erros one of them is paste here severity code description project file line suppression state error unresolved external symbol public cdecl tdf transaction tdf transaction class tcollection asciistring const transaction qeaa aebvtcollection asciistring z referenced in function public void cdecl tdf transaction default constructor closure void ftdf transaction qeaaxxz igescafcontrol c googol projects pythonocc core build igescafcontrolpython wrap obj i have projects in pythonocc successfully built but failed there are many erros how should i resolve this issue thank you very much | 0 |
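With 17 failing projects producing many LNK2019 lines like the one quoted in the report above, a first triage step is to pull the human-readable symbol names out of the MSVC output so duplicates can be counted. A minimal sketch (the second log line below is invented for illustration; only the first mirrors the real report):

```python
import re

# Fragment of MSVC build output shaped like the report above.
log = '''
Error LNK2019 unresolved external symbol "public: __cdecl TDF_Transaction::TDF_Transaction(class TCollection_AsciiString const &)" referenced in function ...
Error LNK2019 unresolved external symbol "public: void __cdecl Foo::Bar(void)" referenced in function ...
'''

# Capture the quoted symbol on each LNK2019 line; MSVC always wraps the
# readable (undecorated) name in double quotes before the mangled form.
symbols = re.findall(r'LNK2019 unresolved external symbol "([^"]+)"', log)
print(len(symbols))  # → 2
```

Grouping the extracted symbols by the library they belong to (here `TDF_Transaction` lives in OpenCASCADE's TKLCAF toolkit) usually points at a toolkit that was built but not passed to the linker.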
21,130 | 2,633,512,327 | IssuesEvent | 2015-03-09 04:09:40 | PeaceGeeks/amani | https://api.github.com/repos/PeaceGeeks/amani | opened | Ticket from Aid Watch on Localization | Top Priority | The page content type is not fully fixed. There should be a "Translate" tab if you edit a page (View/edit/[translate]/devel.) and this tab is still missing. You can compare the About content type for pointers.
For the Main Menu (About, News, Resources, ... Contact), there seems to be an issue with translation as well (not sure if caused by Superfish), check
http://dev.aidwatch.peacegeeks.org/ar/about (notice the location of the Arabic text which should go under About ( = Why Aid Watch Palestine?)
The menu item is translated at
http://dev.aidwatch.peacegeeks.org/admin/structure/menu/item/1001/translate (it shows Arabic as disabled in the drop down box).
I did not check the rest of the content type yet. | 1.0 | Ticket from Aid Watch on Localization - The page content type is not fully fixed. There should be a "Translate" tab if you edit a page (View/edit/[translate]/devel.) and this tab is still missing. You can compare the About content type for pointers.
For the Main Menu (About, News, Resources, ... Contact), there seems to be an issue with translation as well (not sure if caused by Superfish), check
http://dev.aidwatch.peacegeeks.org/ar/about (notice the location of the Arabic text which should go under About ( = Why Aid Watch Palestine?)
The menu item is translated at
http://dev.aidwatch.peacegeeks.org/admin/structure/menu/item/1001/translate (it shows Arabic as disabled in the drop down box).
I did not check the rest of the content type yet. | priority | ticket from aid watch on localization the page content type is not fully fixed there should be a translate tab if you edit a page view edit devel and this tab is still missing you can compare the about content type for pointers for the main menu about news resources contact there seems to be an issue with translation as well not sure if caused by superfish check notice the location of the arabic text which should go uder about why aid watch palestine the menu item is translated at it shows arabic as disabled in the drop down box i did not check the rest of the content type yet | 1 |
132,335 | 12,504,444,245 | IssuesEvent | 2020-06-02 09:01:18 | ocaml/ocaml | https://api.github.com/repos/ocaml/ocaml | closed | Online Manual does not list builtin types and builtin exception in index | Stale documentation | **Original bug ID:** 7708
**Reporter:** @hannesm
**Status:** new
**Resolution:** open
**Priority:** normal
**Severity:** minor
**Version:** 4.06.0
**Category:** ocamldoc
**Monitored by:** @Drup
## Bug description
The very nice manual includes a "list of exceptions" at http://caml.inria.fr/pub/docs/manual-ocaml/libref/index_exceptions.html
This is very useful to read up on exceptions, but the builtin ones, like Invalid_argument, Out_of_memory, Failure, etc. are not listed there (they are explained in the manual (https://caml.inria.fr/pub/docs/manual-ocaml/core.html#sec547)).
It would be really great to extend the index with these builtin exceptions (and types) for a better user experience.
| 1.0 | Online Manual does not list builtin types and builtin exception in index - **Original bug ID:** 7708
**Reporter:** @hannesm
**Status:** new
**Resolution:** open
**Priority:** normal
**Severity:** minor
**Version:** 4.06.0
**Category:** ocamldoc
**Monitored by:** @Drup
## Bug description
The very nice manual includes a "list of exceptions" at http://caml.inria.fr/pub/docs/manual-ocaml/libref/index_exceptions.html
This is very useful to read up on exceptions, but the builtin ones, like Invalid_argument, Out_of_memory, Failure, etc. are not listed there (they are explained in the manual (https://caml.inria.fr/pub/docs/manual-ocaml/core.html#sec547)).
It would be really great to extend the index with these builtin exceptions (and types) for a better user experience.
| non_priority | online manual does not list builtin types and builtin exception in index original bug id reporter hannesm status new resolution open priority normal severity minor version category ocamldoc monitored by drup bug description the very nice manual includes a list of exceptions at this is very useful to read up on exceptions but the builtin ones like invalid argument out of memory failure etc are not listed there they are explained the manual it would be really great to extend the index with these builtin exceptions and types for a better user experience | 0 |
153,930 | 5,906,127,973 | IssuesEvent | 2017-05-19 14:29:52 | elsevier-core-engineering/replicator | https://api.github.com/repos/elsevier-core-engineering/replicator | closed | Make Cluster Scaling Retry Threshold a User Configurable Option | enhancement high-priority | **Description**
When extended node validation was introduced in #62, the new configuration structure was not yet implemented. Now that both changes have been merged, we need to add a new configuration flag to allow users to specify the cluster scaling retry threshold. | 1.0 | Make Cluster Scaling Retry Threshold a User Configurable Option - **Description**
When extended node validation was introduced in #62, the new configuration structure was not yet implemented. Now that both changes have been merged, we need to add a new configuration flag to allow users to specify the cluster scaling retry threshold. | priority | make cluster scaling retry threshold a user configurable option description when extended node validation was introduced in the new configuration structure was not yet implemented now that both changes have been merged we need to add a new configuration flag to allow users to specify the cluster scaling retry threshold | 1 |
311,292 | 9,531,583,293 | IssuesEvent | 2019-04-29 16:20:53 | CCAFS/MARLO | https://api.github.com/repos/CCAFS/MARLO | closed | Documento de entrega de puesto | Priority - Medium in progress | Actualizar el documento de entrega de puesto, teniendo como base el elaborado por Paola Camargo | 1.0 | Documento de entrega de puesto - Actualizar el documento de entrega de puesto, teniendo como base el elaborado por Paola Camargo | priority | documento de entrega de puesto actualizar el documento de entrega de puesto teniendo como base el elaborado por paola camargo | 1 |
822,372 | 30,868,760,619 | IssuesEvent | 2023-08-03 09:50:33 | TalaoDAO/AltMe | https://api.github.com/repos/TalaoDAO/AltMe | closed | resolver for did:web to be coded | Priority AltMe | first solution is to use the universal resolver (external server) to get the DID Document
[12 h 55](https://talao.slack.com/archives/C03V9HU19DM/p1690973723242089)
url = 'https://dev.uniresolver.io/1.0/identifiers/' + did
r = requests.get(url)
[12 h 57](https://talao.slack.com/archives/C03V9HU19DM/p1690973860348219)
Example :
DID Document -> https://dev.uniresolver.io/1.0/identifiers/did:web:app.altme.io:issuer (edited)
[12 h 58](https://talao.slack.com/archives/C03V9HU19DM/p1690973889076089)
and you extract the public key with the verification method
[13 h 01](https://talao.slack.com/archives/C03V9HU19DM/p1690974112048879)
if the universal resolver is not available then status red, if verification fails status red
[13 h 03](https://talao.slack.com/archives/C03V9HU19DM/p1690974205795889)
in this DID https://dev.uniresolver.io/1.0/identifiers/did:web:talao.co
there are 3 keys, that is why we need the verification method to get the good one | 1.0 | resolver for did:web to be coded - first solution is to use the universal resolver (external server) to get the DID Document
[12 h 55](https://talao.slack.com/archives/C03V9HU19DM/p1690973723242089)
url = 'https://dev.uniresolver.io/1.0/identifiers/' + did
r = requests.get(url)
[12 h 57](https://talao.slack.com/archives/C03V9HU19DM/p1690973860348219)
Example :
DID Document -> https://dev.uniresolver.io/1.0/identifiers/did:web:app.altme.io:issuer (edited)
[12 h 58](https://talao.slack.com/archives/C03V9HU19DM/p1690973889076089)
and you extract the public key with the verification method
[13 h 01](https://talao.slack.com/archives/C03V9HU19DM/p1690974112048879)
if the universal resolver is not available then status red, if verification fails status red
[13 h 03](https://talao.slack.com/archives/C03V9HU19DM/p1690974205795889)
in this DID https://dev.uniresolver.io/1.0/identifiers/did:web:talao.co
there are 3 keys, that is why we need the verification method to get the good one | priority | resolver for did web to be coded first solution is to use the universal resolver external serveur to get teh did document url did r requests get url example did document modifié end you extract the pub key with the verification method if the universal resolver is not available then status red if verification fails status red in this did there are keys that is why we need the verification method to get the good one | 1 |
27,851 | 5,114,156,145 | IssuesEvent | 2017-01-06 17:32:08 | edno/kleis | https://api.github.com/repos/edno/kleis | closed | Footer not properly displayed | defect | After deployment of the current *develop* branch, the footer is not correctly displayed

(cache refresh required)
| 1.0 | Footer not properly displayed - After deployment of the current *develop* branch, the footer is not correctly displayed

(cache refresh required)
| non_priority | footer not properly displayed after deployment of the current develop branch the footer is not correctly displayed cache refresh required | 0 |
302,673 | 9,285,327,122 | IssuesEvent | 2019-03-21 06:35:58 | webcompat/web-bugs | https://api.github.com/repos/webcompat/web-bugs | closed | support.mozilla.org - see bug description | browser-firefox-mobile priority-important | <!-- @browser: Firefox Mobile (Tablet) SM T590 -->
<!-- @ua_header: Mozilla/5.0 (Android 8.1.0; Tablet; rv:67.0) Gecko/67.0 Firefox/67.0 -->
<!-- @reported_with: mobile-reporter -->
**URL**: https://support.mozilla.org/de/questions/1245679#question-reply
**Browser / Version**: Firefox Mobile (Tablet) SM T590
**Operating System**: Android 8.1.0
**Tested Another Browser**: Yes
**Problem type**: Something else
**Description**: how i can delete Pocket at my Tablet?
**Steps to Reproduce**:
I would like to uninstall Pocket because I do not care.
[](https://webcompat.com/uploads/2019/3/8a39ee9a-2123-4b81-8950-9c45b963167d.jpeg)
<details>
<summary>Browser Configuration</summary>
<ul>
<li>mixed active content blocked: false</li><li>image.mem.shared: true</li><li>buildID: 20190306095759</li><li>tracking content blocked: false</li><li>gfx.webrender.blob-images: true</li><li>hasTouchScreen: true</li><li>mixed passive content blocked: false</li><li>gfx.webrender.enabled: false</li><li>gfx.webrender.all: false</li><li>channel: nightly</li>
</ul>
<p>Console Messages:</p>
<pre>
[u'[console.log(JQMIGRATE: Logging is active) https://static-media-prod-cdn.sumo.mozilla.net/static/build/common-min.4b219b53323f.js:6:6011]', u'[console.warn(JQMIGRATE: jQuery.browser is deprecated) https://static-media-prod-cdn.sumo.mozilla.net/static/build/common-min.4b219b53323f.js:6:6352]', u'[console.trace() https://static-media-prod-cdn.sumo.mozilla.net/static/build/common-min.4b219b53323f.js:6:6423]', u'[JavaScript Warning: "Cross-Origin Request Blocked: The Same Origin Policy disallows reading the remote resource at https://location.services.mozilla.com/v1/country?key=fa6d7fc9-e091-4be1-b6c1-5ada5815ae9d. (Reason: missing token x-csrftoken in CORS header Access-Control-Allow-Headers from CORS preflight channel)."]', u'[JavaScript Warning: "Cross-Origin Request Blocked: The Same Origin Policy disallows reading the remote resource at https://location.services.mozilla.com/v1/country?key=fa6d7fc9-e091-4be1-b6c1-5ada5815ae9d. (Reason: CORS request did not succeed)."]', u'[JavaScript Error: "Error: Error retrieving geoip data" {file: "https://static-media-prod-cdn.sumo.mozilla.net/static/build/common-min.4b219b53323f.js" line: 10}]\n@https://static-media-prod-cdn.sumo.mozilla.net/static/build/common-min.4b219b53323f.js:10:27787\nc@https://static-media-prod-cdn.sumo.mozilla.net/static/build/common-min.4b219b53323f.js:4:2996\nfireWith@https://static-media-prod-cdn.sumo.mozilla.net/static/build/common-min.4b219b53323f.js:4:3801\nk@https://static-media-prod-cdn.sumo.mozilla.net/static/build/common-min.4b219b53323f.js:5:23260\nr@https://static-media-prod-cdn.sumo.mozilla.net/static/build/common-min.4b219b53323f.js:5:27623\n']
</pre>
</details>
_From [webcompat.com](https://webcompat.com/) with ❤️_ | 1.0 | support.mozilla.org - see bug description - <!-- @browser: Firefox Mobile (Tablet) SM T590 -->
<!-- @ua_header: Mozilla/5.0 (Android 8.1.0; Tablet; rv:67.0) Gecko/67.0 Firefox/67.0 -->
<!-- @reported_with: mobile-reporter -->
**URL**: https://support.mozilla.org/de/questions/1245679#question-reply
**Browser / Version**: Firefox Mobile (Tablet) SM T590
**Operating System**: Android 8.1.0
**Tested Another Browser**: Yes
**Problem type**: Something else
**Description**: how i can delete Pocket at my Tablet?
**Steps to Reproduce**:
I would like to uninstall Pocket because I do not care.
[](https://webcompat.com/uploads/2019/3/8a39ee9a-2123-4b81-8950-9c45b963167d.jpeg)
<details>
<summary>Browser Configuration</summary>
<ul>
<li>mixed active content blocked: false</li><li>image.mem.shared: true</li><li>buildID: 20190306095759</li><li>tracking content blocked: false</li><li>gfx.webrender.blob-images: true</li><li>hasTouchScreen: true</li><li>mixed passive content blocked: false</li><li>gfx.webrender.enabled: false</li><li>gfx.webrender.all: false</li><li>channel: nightly</li>
</ul>
<p>Console Messages:</p>
<pre>
[u'[console.log(JQMIGRATE: Logging is active) https://static-media-prod-cdn.sumo.mozilla.net/static/build/common-min.4b219b53323f.js:6:6011]', u'[console.warn(JQMIGRATE: jQuery.browser is deprecated) https://static-media-prod-cdn.sumo.mozilla.net/static/build/common-min.4b219b53323f.js:6:6352]', u'[console.trace() https://static-media-prod-cdn.sumo.mozilla.net/static/build/common-min.4b219b53323f.js:6:6423]', u'[JavaScript Warning: "Cross-Origin Request Blocked: The Same Origin Policy disallows reading the remote resource at https://location.services.mozilla.com/v1/country?key=fa6d7fc9-e091-4be1-b6c1-5ada5815ae9d. (Reason: missing token x-csrftoken in CORS header Access-Control-Allow-Headers from CORS preflight channel)."]', u'[JavaScript Warning: "Cross-Origin Request Blocked: The Same Origin Policy disallows reading the remote resource at https://location.services.mozilla.com/v1/country?key=fa6d7fc9-e091-4be1-b6c1-5ada5815ae9d. (Reason: CORS request did not succeed)."]', u'[JavaScript Error: "Error: Error retrieving geoip data" {file: "https://static-media-prod-cdn.sumo.mozilla.net/static/build/common-min.4b219b53323f.js" line: 10}]\n@https://static-media-prod-cdn.sumo.mozilla.net/static/build/common-min.4b219b53323f.js:10:27787\nc@https://static-media-prod-cdn.sumo.mozilla.net/static/build/common-min.4b219b53323f.js:4:2996\nfireWith@https://static-media-prod-cdn.sumo.mozilla.net/static/build/common-min.4b219b53323f.js:4:3801\nk@https://static-media-prod-cdn.sumo.mozilla.net/static/build/common-min.4b219b53323f.js:5:23260\nr@https://static-media-prod-cdn.sumo.mozilla.net/static/build/common-min.4b219b53323f.js:5:27623\n']
</pre>
</details>
_From [webcompat.com](https://webcompat.com/) with ❤️_ | priority | support mozilla org see bug description url browser version firefox mobile tablet sm operating system android tested another browser yes problem type something else description how i can delete pocket at my tablet steps to reproduce i would like to uninstall pocket because i do not care browser configuration mixed active content blocked false image mem shared true buildid tracking content blocked false gfx webrender blob images true hastouchscreen true mixed passive content blocked false gfx webrender enabled false gfx webrender all false channel nightly console messages u u u u u n from with ❤️ | 1 |
832,130 | 32,073,025,283 | IssuesEvent | 2023-09-25 09:13:51 | Avaiga/taipy-core | https://api.github.com/repos/Avaiga/taipy-core | closed | Problem of version compatibility even with Develop mode on | Core: Versioning 🟧 Priority: High 💥Malfunction | **Description**
Changing configuration might create issues in Taipy Core even in Develop mode. Changing the name of the function used by a task raises an error.
**How to reproduce**
- Run the code
```python
from taipy import Config, Scope, Frequency
import datetime as dt
import taipy as tp
import pandas as pd
PATH_CSV = 'dataset.csv'
PATH_PARQUET = 'dataset.parquet'
def transform(csv, parquet, pickle):
print(" Cleaning data")
return csv, 5, "hello", dt.datetime.now()
## Input Data csv
csv_cfg = Config.configure_data_node(id="csv", storage_type="csv", path=PATH_CSV)  # scope=Scope.GLOBAL
parquet_cfg = Config.configure_data_node(id="parquet", scope=Scope.CYCLE, storage_type="parquet", path=PATH_PARQUET)
pickle_cfg = Config.configure_data_node(id="pickle")
## Remaining Data Node
data_out_cfg = Config.configure_data_node(id="data_out")
int_cfg = Config.configure_data_node(id="int")
string_cfg = Config.configure_data_node(id="string")
date_cfg = Config.configure_data_node(id="date")
# Task config objects
transform_task_cfg = Config.configure_task(id="transform",
function=transform,
input=[csv_cfg,parquet_cfg,pickle_cfg],
output=[data_out_cfg, int_cfg, string_cfg, date_cfg],
skippable=True)
# Configure our scenario config.
scenario_cfg = Config.configure_scenario(id="scenario", task_configs=[transform_task_cfg], frequency=Frequency.MONTHLY)
if __name__ == "__main__":
tp.Core().run()
scenario = tp.create_scenario(config=scenario_cfg)
data_pickle = pd.DataFrame({"Hello": [1, 2, 3], "World": [4, 5, 6]})
scenario.pickle.write(data_pickle)
data_csv = pd.DataFrame({"Hi": ["red", "step", 'true'], "World": [None, 5, 6]})
data_csv.to_csv(PATH_CSV)
data_parquet = pd.DataFrame({"Date":[dt.datetime(2021, 1, 1), dt.datetime(2021, 1, 2), dt.datetime(2021, 1, 3)], "Value": [1, 2, 3]})
data_parquet.to_parquet(PATH_PARQUET)
print(scenario.pickle.read())
```
- Rename the function used:
```python
from taipy import Config, Scope, Frequency
import datetime as dt
import taipy as tp
import pandas as pd
PATH_CSV = 'dataset.csv'
PATH_PARQUET = 'dataset.parquet'
def other_function(csv, parquet, pickle):
print(" Cleaning data")
return csv, 5, "hello", dt.datetime.now()
## Input Data csv
csv_cfg = Config.configure_data_node(id="csv", storage_type="csv", path=PATH_CSV)#scope=Scope.GLOBAL,
parquet_cfg = Config.configure_data_node(id="parquet", scope=Scope.CYCLE, storage_type="parquet", path=PATH_PARQUET)
pickle_cfg = Config.configure_data_node(id="pickle")
## Remaining Data Node
data_out_cfg = Config.configure_data_node(id="data_out")
int_cfg = Config.configure_data_node(id="int")
string_cfg = Config.configure_data_node(id="string")
date_cfg = Config.configure_data_node(id="date")
# Task config objects
transform_task_cfg = Config.configure_task(id="transform",
function=other_function,
input=[csv_cfg,parquet_cfg,pickle_cfg],
output=[data_out_cfg, int_cfg, string_cfg, date_cfg],
skippable=True)
# Configure our scenario config.
scenario_cfg = Config.configure_scenario(id="scenario", task_configs=[transform_task_cfg], frequency=Frequency.MONTHLY)
if __name__ == "__main__":
tp.Core().run()
scenario = tp.create_scenario(config=scenario_cfg)
data_pickle = pd.DataFrame({"Hello": [1, 2, 3], "World": [4, 5, 6]})
scenario.pickle.write(data_pickle)
data_csv = pd.DataFrame({"Hi": ["red", "step", 'true'], "World": [None, 5, 6]})
data_csv.to_csv(PATH_CSV)
data_parquet = pd.DataFrame({"Date":[dt.datetime(2021, 1, 1), dt.datetime(2021, 1, 2), dt.datetime(2021, 1, 3)], "Value": [1, 2, 3]})
data_parquet.to_parquet(PATH_PARQUET)
print(scenario.pickle.read())
```
**Expected behavior**
This shouldn't create any issue as we are in develop mode for Taipy Core.
**Screenshots**
When available and relevant, screenshots better help show the problem.
**Runtime environment**
Taipy 3.0 develop
| 1.0 | priority | 1 |
84,028 | 7,887,931,478 | IssuesEvent | 2018-06-27 20:13:39 | btcpayserver/commerce_btcpay | https://api.github.com/repos/btcpayserver/commerce_btcpay | opened | add tests | tests | Challenge: the other commerce payment plugins use the api credentials of the payment providers sandbox urls. As BTCPay is self hosted there is no such thing.
Option 1: mock the responses and payment providers
Option 2: write tests against a public test BTCPay instance, maybe the official one? (Needs to work with existing paired keys or allow to automate that somehow)
Opition 3: leave out those tests :(
| 1.0 | add tests - Challenge: the other commerce payment plugins use the api credentials of the payment providers sandbox urls. As BTCPay is self hosted there is no such thing.
Option 1: mock the responses and payment providers
Option 2: write tests against a public test BTCPay instance, maybe the official one? (Needs to work with existing paired keys or allow to automate that somehow)
Opition 3: leave out those tests :(
| non_priority | add tests challenge the other commerce payment plugins use the api credentials of the payment providers sandbox urls as btcpay is self hosted there is no such thing option mock the responses and payment providers option write tests against a public test btcpay instance maybe the official one needs to work with existing paired keys or allow to automate that somehow opition leave out those tests | 0 |
221,837 | 17,372,445,265 | IssuesEvent | 2021-07-30 15:41:38 | QubesOS/updates-status | https://api.github.com/repos/QubesOS/updates-status | closed | dist-upgrade v4.0.1 (r4.0) | r4.0-dom0-cur-test | Update of dist-upgrade to v4.0.1 for Qubes r4.0, see comments below for details.
Built from: https://github.com/QubesOS/qubes-dist-upgrade/commit/9403b03e7c2067b13cb7e2dba17df95a4ed200a6
[Changes since previous version](https://github.com/QubesOS/qubes-dist-upgrade/compare/v4.0.0...v4.0.1):
QubesOS/qubes-dist-upgrade@9403b03 version 4.0.1
QubesOS/qubes-dist-upgrade@b0cf3b0 Include upgrade-template-standalone.sh in the package
QubesOS/qubes-dist-upgrade@4915f2c Replace Travis with Gitlab-ci
QubesOS/qubes-dist-upgrade@92a1b9e Debian: use apt dist-upgrade
QubesOS/qubes-dist-upgrade@6218ec9 Improve latest info message
QubesOS/qubes-dist-upgrade@5fc0d57 Fix typo in recommended_size
QubesOS/qubes-dist-upgrade@ccd64d0 Handle TemplateVM and StandaloneVM upgrade
QubesOS/qubes-dist-upgrade@0611811 Wrap the shutdown of unnecessary VMs
QubesOS/qubes-dist-upgrade@738a930 Install the script as /usr/sbin/qubes-dist-upgrade
QubesOS/qubes-dist-upgrade@315cd60 Add --all option to execute all the stages in one call
QubesOS/qubes-dist-upgrade@4d77cf4 Automatically detect LVM pool name
QubesOS/qubes-dist-upgrade@8e5f208 exclude running kernel from dnf distro-sync
QubesOS/qubes-dist-upgrade@9a59e09 Use lvs --nosuffix instead of manually stripping units
Referenced issues:
If you're release manager, you can issue GPG-inline signed command:
* `Upload dist-upgrade 9403b03e7c2067b13cb7e2dba17df95a4ed200a6 r4.0 current repo` (available 7 days from now)
* `Upload dist-upgrade 9403b03e7c2067b13cb7e2dba17df95a4ed200a6 r4.0 current (dists) repo`, you can choose subset of distributions, like `vm-fc24 vm-fc25` (available 7 days from now)
* `Upload dist-upgrade 9403b03e7c2067b13cb7e2dba17df95a4ed200a6 r4.0 security-testing repo`
Above commands will work only if packages in current-testing repository were built from given commit (i.e. no new version superseded it).
| 1.0 | non_priority | 0 |
65,332 | 27,066,330,232 | IssuesEvent | 2023-02-14 00:56:38 | hashicorp/terraform-provider-aws | https://api.github.com/repos/hashicorp/terraform-provider-aws | closed | [Bug]: IAM ARN not supported by the provider | bug service/iam | ### Terraform Core Version
1.3.3
### AWS Provider Version
4.51.5
### Affected Resource(s)
aws_iam_policy_document
### Expected Behavior
Terraform shouldn't complain when using the ARN `arn:aws:iam::account:u2f/u2f-token-id`
### Actual Behavior
When applying an IAM policy.
resource: `aws_iam_policy_document`
using the following resource is not allowed: `arn:aws:iam::account:u2f/u2f-token-id`
Terraform error is:
```shell
Error: updating IAM policy arn:aws:iam::111111111111:policy/IAMSelfManage: MalformedPolicyDocument: IAM resource path must either be "*" or start with user/, federated-user/, role/, group/, instance-profile/, mfa/, server-certificate/, policy/, sms-mfa/, saml-provider/, oidc-provider/, report/, access-report/. status code: 400, request id: XXXXXXXX-XXXXXXX-XXXXXXXf
with aws_iam_policy.iam_self_modify
on main.tf line XX, in resource "aws_iam_policy" "my_policy":
```
### Relevant Error/Panic Output Snippet
```shell
Error: updating IAM policy arn:aws:iam::111111111111:policy/IAMSelfManage: MalformedPolicyDocument: IAM resource path must either be "*" or start with user/, federated-user/, role/, group/, instance-profile/, mfa/, server-certificate/, policy/, sms-mfa/, saml-provider/, oidc-provider/, report/, access-report/. status code: 400, request id: XXXXXXXX-XXXXXXX-XXXXXXXf
with aws_iam_policy.iam_self_modify
on main.tf line XX, in resource "aws_iam_policy" "my_policy":
```
### Terraform Configuration Files
```terraform
data "aws_iam_policy_document" "my_policy" {
statement {
sid = "AllowManageOwnVirtualMFADevice"
effect = "Allow"
actions = [
"iam:CreateVirtualMFADevice",
]
resources = [
"arn:aws:iam::*:mfa/*",
"arn:aws:iam::*:u2f/*"
]
}
}
```
### Steps to Reproduce
Try to use the resource `arn:aws:iam::account:u2f/u2f-token-id` in an `aws_iam_policy_document`
example:
```terraform
data "aws_iam_policy_document" "my_policy" {
statement {
sid = "AllowManageOwnVirtualMFADevice"
effect = "Allow"
actions = [
"iam:CreateVirtualMFADevice",
]
resources = [
"arn:aws:iam::*:mfa/*",
"arn:aws:iam::*:u2f/*"
]
}
}
```
### Debug Output
_No response_
### Panic Output
_No response_
### Important Factoids
_No response_
### References
_No response_
### Would you like to implement a fix?
None | 1.0 | non_priority | 0 |
11,420 | 3,203,016,221 | IssuesEvent | 2015-10-02 16:47:56 | kubernetes/kubernetes | https://api.github.com/repos/kubernetes/kubernetes | closed | Multiple e2e reboot tests failing consistently with "Node [xxx] failed reboot test." | area/test kind/flake priority/P1 team/test-infra | e.g. kubernetes-e2e-gce-reboot/3412/
Identified problems
Reboot each node by dropping all outbound packets for a while and ensure they function afterwards
/go/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/test/e2e/reboot.go:99
Aug 17 08:16:57.838: Node e2e-reboot-minion-6ovw failed reboot test.
Reboot each node by ordering clean reboot and ensure they function upon restart
/go/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/test/e2e/reboot.go:65
Aug 17 08:17:23.058: Node e2e-reboot-minion-6ovw failed reboot test.
Reboot each node by dropping all inbound packets for a while and ensure they function afterwards
/go/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/test/e2e/reboot.go:91
Aug 17 08:17:48.247: Node e2e-reboot-minion-6ovw failed reboot test.
Reboot each node by ordering unclean reboot and ensure they function upon restart
/go/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/test/e2e/reboot.go:71
Aug 17 08:18:13.443: Node e2e-reboot-minion-6ovw failed reboot test.
Reboot each node by triggering kernel panic and ensure they function upon restart
/go/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/test/e2e/reboot.go:77
Aug 17 08:18:38.648: Node e2e-reboot-minion-6ovw failed reboot test.
Reboot each node by switching off the network interface and ensure they function upon switch on
/go/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/test/e2e/reboot.go:83
Aug 17 08:16:32.647: Node e2e-reboot-minion-6ovw failed reboot test. | 2.0 | non_priority | 0 |
391,617 | 11,576,283,978 | IssuesEvent | 2020-02-21 11:34:47 | fac18/safe-space | https://api.github.com/repos/fac18/safe-space | closed | Feedback page - confirm submission of report | E2 high priority user story | - [ ] unique reference number
- [ ] As a user I want to know that my information has been securely submitted | 1.0 | priority | 1 |
99,573 | 12,440,633,255 | IssuesEvent | 2020-05-26 12:21:27 | markovmodel/PyEMMA | https://api.github.com/repos/markovmodel/PyEMMA | closed | [opinion] dynamic caching is more viable than counting memory consumption | design-discussion wontfix | The implementation of the 'operate in memory' function in the coordinate module is complicated by the fact that it is impossible to reliably estimate the memory consumption in the transformers. Each transformer calls one or many numpy or mdtraj functions, each of which may allocate dynamic memory. Those allocations are rarely documented and we are forced to inspect the source code of the libraries. These allocations can be of the same size as the chunk size.
Additionally the application runs in a multitasking operating system. So querying the amount of free memory and subtracting the amount of memory that our application is expected to consume is not a reliable estimate of the free computer memory. Opening a tab in the Firefox browser may suddenly decrease the free memory by dozens of megabytes. Combined with our memory-greedy strategy of putting as much data into the RAM as possible, this is a recipe for disaster.
One solution to this problem is that our application should have the ability to 'back off' and return memory to the operating system in times of memory shortage. All operating systems have an API to report memory shortage to programs. We could interface this API and replace the 'operate in memory' by a data cache. When there is no memory shortage, all data is held in memory. In case of shortage, the cache is emptied and some chunks have to be recomputed.
Here are some possible guidelines for the implementation of such a cache:
- Transformers should not attempt to allocate one huge block of memory that will hold all data.
Instead they should allocate a cache page for every chunk they receive from their data producer. If there is no memory shortage, the transformer can recover all the data from the cache.
Example:
``` Python
# caching a chunk
cache.insert(traj_number, frame_number, chunk)
```
- Cache pages that are not used at the moment can 'go away'. If the transformer tries to fetch a chunk from the cache that has gone away, a KeyError is raised. The transformer can then react to this by requesting the chunk again from its data producer.
Example:
``` Python
# fetching a chunk
try:
    chunk = cache.fetch(traj_number, frame_number)
except KeyError:
    # cache miss: request the chunk from the data producer again
    chunk = recompute_chunk(traj_number, frame_number)  # hypothetical recovery helper
```
- Deletion of cache pages happens as a reaction to memory shortage. Memory shortage is reported directly from the operating system, see https://www.kernel.org/doc/Documentation/cgroups/cgroups.txt http://www.newosxbook.com/articles/MemoryPressure.html
In response, the cache can delete chunks that are currently not used. Whether a page is currently in use can be found out by checking the refcount of the numpy array which holds the cached chunk.
The operating system notifications are received in a second (Python) thread. The cache is protected with a lock to prevent race conditions.
- On memory shortage, the cache attempts to delete first those pages that have not been accessed for the longest time. The oldest chunks are typically the following:
- chunks in chains that are not used but where the user still holds a reference
- chunks from the first steps of a pipeline. When caching is enabled, these chunks are created first and are only accessed once or twice when the next transformer in the chain is parametrized and run. Therfore these are the oldest cached chunks.
- The first frames from the first trajectories. They are always accessed before the chunks from the end of the trajectory. These chunks are easy to recover, because each transformer can reset its data producer and query the first chunks from the first trajectories again.
So this simplest form of caching which is based on access times, should give good performance when combined with the transformation pipeline.
| 1.0 | [opinion] dynamic caching is more viable than counting memory consumption - The implementation of the 'operate in memory' function in the coordinate module is complicated by the fact that it is impossible to reliably estimate the memory consumption in the transfromers. Each transformer calls one or many numpy or mdtraj functions each of which may allocate dynamic memory. Those allocations are rarely documentated and we are forced to inspect the source code of the libraries. These allocations can be of the same size as the chunk size.
Additionally the application runs in a multitasking operating system. So querying the amount of free memory and subtracting the amout of memory that our application is expected to consume is no reliable estimate for the free computer memory. Opening a tab in the Firefox browser may suddenly decrease the free memory by dozens of megabytes. Combined with our memory-greedy strategy of putting as many data into the RAM as possible this is a recipe for disaster.
One solution to this problem is that our application should have the abilty to 'back off' an return memory to the operating system in times of memory shortage. All operating systems have an API to report memory shortage to programs. We could interface this API and replace the 'operate in memory' by a data cache. When there is no memory shortage, all data is held in memory. In case of shortage, the cache is emptied and some chunks have to be recomputed.
Here are some possible guidelines for the implemention of such a cache:
- Transformers should not attempt to allocate one huge block of memory that will hold all data.
Instead they should allocate a cache page for every chunk they recieve from their data producer. If there is no memory shortage, the transfomer can recover all the data from the cache.
Example:
``` Python
# caching a chunk
cache.insert(traj_number,frame_number,chunk)
```
- Cache pages that are not used at the moment can 'go away'. If the transfomer tries to fetch a chunk from the cache that has gone away, a KeyError is raised. The transfomer can then react to this by requesting the chunk again from its data producer.
Example:
``` Python
# fetching a chunk
try:
chunk = cache.fetch(traj_number,frame_number)
except KeyError:
# cache miss, get chunk from data producer
```
- Deletion of cache pages happens as a rection to memory shortage. Memory shortage is reported directly from the operating system, see https://www.kernel.org/doc/Documentation/cgroups/cgroups.txt http://www.newosxbook.com/articles/MemoryPressure.html
In response, the cache can delete chunks that are currently not used. Whether a page is currently in use can be found out by checking the refcount of the numpy array which holds the cached chunk.
The operating system notifications are received in a second (Python) thread. The cache is protected with a lock to prevent race conditions.
- On memory shortage, the cache attepts to delete those pages first, that have not been accessed for the longest time. The oldest chunks are typically the following:
- chunks in chains that are not used but where the user still holds a reference
- chunks from the first steps of a pipeline. When caching is enabled, these chunks are created first and are only accessed once or twice when the next transformer in the chain is parametrized and run. Therfore these are the oldest cached chunks.
- The first frames from the first trajectories. They are always accessed before the chunks from the end of the trajectory. These chunks are easy to recover, because each transformer can reset its data producer and query the first chunks from the first trajectories again.
So this simplest form of caching which is based on access times, should give good performance when combined with the transformation pipeline.
| non_priority | dynamic caching is more viable than counting memory consumption the implementation of the operate in memory function in the coordinate module is complicated by the fact that it is impossible to reliably estimate the memory consumption in the transfromers each transformer calls one or many numpy or mdtraj functions each of which may allocate dynamic memory those allocations are rarely documentated and we are forced to inspect the source code of the libraries these allocations can be of the same size as the chunk size additionally the application runs in a multitasking operating system so querying the amount of free memory and subtracting the amout of memory that our application is expected to consume is no reliable estimate for the free computer memory opening a tab in the firefox browser may suddenly decrease the free memory by dozens of megabytes combined with our memory greedy strategy of putting as many data into the ram as possible this is a recipe for disaster one solution to this problem is that our application should have the abilty to back off an return memory to the operating system in times of memory shortage all operating systems have an api to report memory shortage to programs we could interface this api and replace the operate in memory by a data cache when there is no memory shortage all data is held in memory in case of shortage the cache is emptied and some chunks have to be recomputed here are some possible guidelines for the implemention of such a cache transformers should not attempt to allocate one huge block of memory that will hold all data instead they should allocate a cache page for every chunk they recieve from their data producer if there is no memory shortage the transfomer can recover all the data from the cache example python caching a chunk cache insert traj number frame number chunk cache pages that are not used at the moment can go away if the transfomer tries to fetch a chunk from the cache that has gone away a 
keyerror is raised the transfomer can then react to this by requesting the chunk again from its data producer example python fetching a chunk try chunk cache fetch traj number frame number except keyerror cache miss get chunk from data producer deletion of cache pages happens as a rection to memory shortage memory shortage is reported directly from the operating system see in response the cache can delete chunks that are currently not used whether a page is currently in use can be found out by checking the refcount of the numpy array which holds the cached chunk the operating system notifications are received in a second python thread the cache is protected with a lock to prevent race conditions on memory shortage the cache attepts to delete those pages first that have not been accessed for the longest time the oldest chunks are typically the following chunks in chains that are not used but where the user still holds a reference chunks from the first steps of a pipeline when caching is enabled these chunks are created first and are only accessed once or twice when the next transformer in the chain is parametrized and run therfore these are the oldest cached chunks the first frames from the first trajectories they are always accessed before the chunks from the end of the trajectory these chunks are easy to recover because each transformer can reset its data producer and query the first chunks from the first trajectories again so this simplest form of caching which is based on access times should give good performance when combined with the transformation pipeline | 0 |
628,680 | 20,010,627,887 | IssuesEvent | 2022-02-01 05:42:57 | geoff-maddock/events-tracker | https://api.github.com/repos/geoff-maddock/events-tracker | closed | Design - visual flourish | style low priority | Look at codepen.io for some ideas of how to build svg or other icons or animations for the site | 1.0 | Design - visual flourish - Look at codepen.io for some ideas of how to build svg or other icons or animations for the site | priority | design visual flourish look at codepen io for some ideas of how to build svg or other icons or animations for the site | 1 |
127,952 | 5,040,984,741 | IssuesEvent | 2016-12-19 08:38:21 | steedos/apps | https://api.github.com/repos/steedos/apps | closed | 系统关闭时记录最近打开的URL,下次登录后自动进入对应的URL | fix:Done priority:High | URL 保存在 localStorage 中,
localStorage.setItem("Steedos.lastURL") | 1.0 | 系统关闭时记录最近打开的URL,下次登录后自动进入对应的URL - URL 保存在 localStorage 中,
localStorage.setItem("Steedos.lastURL") | priority | 系统关闭时记录最近打开的url,下次登录后自动进入对应的url url 保存在 localstorage 中, localstorage setitem steedos lasturl | 1 |
615,297 | 19,252,684,998 | IssuesEvent | 2021-12-09 07:53:20 | projectdiscovery/nuclei | https://api.github.com/repos/projectdiscovery/nuclei | closed | Templates on windows unzipping in the root folder | Priority: High Status: Completed Type: Bug | <!--
1. Please search to see if an issue already exists for the bug you encountered.
2. For support requests, FAQs or "How to" questions, please use the GitHub Discussions section instead - https://github.com/projectdiscovery/nuclei/discussions or
3. Join our discord server at https://discord.gg/projectdiscovery and post the question on the #nuclei channel.
-->
<!-- ISSUES MISSING IMPORTANT INFORMATION MAY BE CLOSED WITHOUT INVESTIGATION. -->
### Nuclei version:
v2.5.4
### Current Behavior:

### Expected Behavior:
Templates are installed following the same directory structure
### Steps To Reproduce:
```
nuclei.exe -ut
```
### Anything else:
<!-- Links? References? Screenshots? Anything that will give us more context about the issue that you are encountering! -->
| 1.0 | Templates on windows unzipping in the root folder - <!--
1. Please search to see if an issue already exists for the bug you encountered.
2. For support requests, FAQs or "How to" questions, please use the GitHub Discussions section instead - https://github.com/projectdiscovery/nuclei/discussions or
3. Join our discord server at https://discord.gg/projectdiscovery and post the question on the #nuclei channel.
-->
<!-- ISSUES MISSING IMPORTANT INFORMATION MAY BE CLOSED WITHOUT INVESTIGATION. -->
### Nuclei version:
v2.5.4
### Current Behavior:

### Expected Behavior:
Templates are installed following the same directory structure
### Steps To Reproduce:
```
nuclei.exe -ut
```
### Anything else:
<!-- Links? References? Screenshots? Anything that will give us more context about the issue that you are encountering! -->
| priority | templates on windows unzipping in the root folder please search to see if an issue already exists for the bug you encountered for support requests faqs or how to questions please use the github discussions section instead or join our discord server at and post the question on the nuclei channel nuclei version current behavior expected behavior templates are installed following the same directory structure steps to reproduce nuclei exe ut anything else | 1 |
328,919 | 10,001,471,958 | IssuesEvent | 2019-07-12 15:42:28 | phetsims/QA | https://api.github.com/repos/phetsims/QA | closed | Test behavior of Talkback | QA:a11y priority:3-medium | We currently don't support Talkback (Google screen reader for Android) but we hope to in the future. We would like to know how well it behaves in general and with our sims.
We can use these two sims to test:
[Gravity Force Lab: Basics](https://phet-dev.colorado.edu/html/gravity-force-lab-basics/1.0.0-dev.39/phet/gravity-force-lab-basics_all_phet.html)
[Ohm's Law](https://phet-dev.colorado.edu/html/ohms-law/1.4.0/phet/ohms-law_en_phet.html)
It would be helpful to at least start this over a zoom call so I can see the behavior and make sure there isn't anything glaring that would make it impossible to test.
Assigning to @KatieWoe and myself to coordinate. | 1.0 | Test behavior of Talkback - We currently don't support Talkback (Google screen reader for Android) but we hope to in the future. We would like to know how well it behaves in general and with our sims.
We can use these two sims to test:
[Gravity Force Lab: Basics](https://phet-dev.colorado.edu/html/gravity-force-lab-basics/1.0.0-dev.39/phet/gravity-force-lab-basics_all_phet.html)
[Ohm's Law](https://phet-dev.colorado.edu/html/ohms-law/1.4.0/phet/ohms-law_en_phet.html)
It would be helpful to at least start this over a zoom call so I can see the behavior and make sure there isn't anything glaring that would make it impossible to test.
Assigning to @KatieWoe and myself to coordinate. | priority | test behavior of talkback we currently don t support talkback google screen reader for android but we hope to in the future we would like to know how well it behaves in general and with our sims we can use these two sims to test it would be helpful to at least start this over a zoom call so i can see the behavior and make sure there isn t anything glaring that would make it impossible to test assigning to katiewoe and myself to coordinate | 1 |
475,561 | 13,722,679,226 | IssuesEvent | 2020-10-03 05:04:35 | kubeflow/manifests | https://api.github.com/repos/kubeflow/manifests | closed | kf-serving gateway for v1.0.0 fails to build with kfctl | area/kfctl kind/bug lifecycle/stale priority/p1 | https://github.com/kubeflow/manifests/pull/949 was recently merged into master for a `kf-serving gateway` that fixes `kf-serving` problems. However, `kfctl build` fails with the `gcp_iap_v1.0.0.yaml` due to the `v1.0.0` branch not also being updated with the PR. But It's possible I'm making a mistake here.
Any help is much appreciated, and I'm grateful for the work @krishnadurai has put into this already. | 1.0 | kf-serving gateway for v1.0.0 fails to build with kfctl - https://github.com/kubeflow/manifests/pull/949 was recently merged into master for a `kf-serving gateway` that fixes `kf-serving` problems. However, `kfctl build` fails with the `gcp_iap_v1.0.0.yaml` due to the `v1.0.0` branch not also being updated with the PR. But It's possible I'm making a mistake here.
Any help is much appreciated, and I'm grateful for the work @krishnadurai has put into this already. | priority | kf serving gateway for fails to build with kfctl was recently merged into master for a kf serving gateway that fixes kf serving problems however kfctl build fails with the gcp iap yaml due to the branch not also being updated with the pr but it s possible i m making a mistake here any help is much appreciated and i m grateful for the work krishnadurai has put into this already | 1 |
544,916 | 15,931,381,065 | IssuesEvent | 2021-04-14 03:13:25 | davidsaulrodriguez/shop-portal | https://api.github.com/repos/davidsaulrodriguez/shop-portal | opened | 🚀 - Set up basic stripe | effort: 3 priority: now type: feature request work: complicated | <!--⚠️⚠️⚠️⚠️⚠️⚠️⚠️⚠️⚠️⚠️⚠️⚠️⚠️⚠️⚠️⚠️⚠️⚠️⚠️⚠️⚠️⚠️⚠️⚠️⚠️⚠️⚠️⚠️⚠️⚠️⚠️⚠️
Oh hi there! 😏
To expedite issue processing please search open and closed issues before submitting a new one.
Existing issues often contain information about workarounds, resolution, or progress updates.
⚠️⚠️⚠️⚠️⚠️⚠️⚠️⚠️⚠️⚠️⚠️⚠️⚠️⚠️⚠️⚠️⚠️⚠️⚠️⚠️⚠️⚠️⚠️⚠️⚠️⚠️⚠️⚠️⚠️⚠️⚠️⚠️⚠️⚠️⚠️-->
# 🚀 feature request
Adding basic code for stripe
| 1.0 | 🚀 - Set up basic stripe - <!--⚠️⚠️⚠️⚠️⚠️⚠️⚠️⚠️⚠️⚠️⚠️⚠️⚠️⚠️⚠️⚠️⚠️⚠️⚠️⚠️⚠️⚠️⚠️⚠️⚠️⚠️⚠️⚠️⚠️⚠️⚠️⚠️
Oh hi there! 😏
To expedite issue processing please search open and closed issues before submitting a new one.
Existing issues often contain information about workarounds, resolution, or progress updates.
⚠️⚠️⚠️⚠️⚠️⚠️⚠️⚠️⚠️⚠️⚠️⚠️⚠️⚠️⚠️⚠️⚠️⚠️⚠️⚠️⚠️⚠️⚠️⚠️⚠️⚠️⚠️⚠️⚠️⚠️⚠️⚠️⚠️⚠️⚠️-->
# 🚀 feature request
Adding basic code for stripe
| priority | 🚀 set up basic stripe ⚠️⚠️⚠️⚠️⚠️⚠️⚠️⚠️⚠️⚠️⚠️⚠️⚠️⚠️⚠️⚠️⚠️⚠️⚠️⚠️⚠️⚠️⚠️⚠️⚠️⚠️⚠️⚠️⚠️⚠️⚠️⚠️ oh hi there 😏 to expedite issue processing please search open and closed issues before submitting a new one existing issues often contain information about workarounds resolution or progress updates ⚠️⚠️⚠️⚠️⚠️⚠️⚠️⚠️⚠️⚠️⚠️⚠️⚠️⚠️⚠️⚠️⚠️⚠️⚠️⚠️⚠️⚠️⚠️⚠️⚠️⚠️⚠️⚠️⚠️⚠️⚠️⚠️⚠️⚠️⚠️ 🚀 feature request adding basic code for stripe | 1 |
264,158 | 23,099,669,196 | IssuesEvent | 2022-07-27 00:24:11 | MPMG-DCC-UFMG/F01 | https://api.github.com/repos/MPMG-DCC-UFMG/F01 | closed | Teste de generalizacao para a tag Servidores - Registro da remuneração - Couto de Magalhães de Minas | generalization test development template-Síntese tecnologia informatica subtag-Dados de Remuneração tag-Servidores | DoD: Realizar o teste de Generalização do validador da tag Servidores - Registro da remuneração para o Município de Couto de Magalhães de Minas. | 1.0 | Teste de generalizacao para a tag Servidores - Registro da remuneração - Couto de Magalhães de Minas - DoD: Realizar o teste de Generalização do validador da tag Servidores - Registro da remuneração para o Município de Couto de Magalhães de Minas. | non_priority | teste de generalizacao para a tag servidores registro da remuneração couto de magalhães de minas dod realizar o teste de generalização do validador da tag servidores registro da remuneração para o município de couto de magalhães de minas | 0 |
52,267 | 13,731,656,072 | IssuesEvent | 2020-10-05 01:52:58 | jtimberlake/griffin | https://api.github.com/repos/jtimberlake/griffin | opened | WS-2019-0493 (High) detected in handlebars-1.3.0.tgz | security vulnerability | ## WS-2019-0493 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>handlebars-1.3.0.tgz</b></p></summary>
<p>Handlebars provides the power necessary to let you build semantic templates effectively with no frustration</p>
<p>Library home page: <a href="https://registry.npmjs.org/handlebars/-/handlebars-1.3.0.tgz">https://registry.npmjs.org/handlebars/-/handlebars-1.3.0.tgz</a></p>
<p>Path to dependency file: griffin/ui/angular/package.json</p>
<p>Path to vulnerable library: griffin/ui/angular/node_modules/handlebars/package.json</p>
<p>
Dependency Hierarchy:
- cli-1.3.0.tgz (Root Library)
- postcss-url-5.1.2.tgz
- directory-encoder-0.7.2.tgz
- :x: **handlebars-1.3.0.tgz** (Vulnerable Library)
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
handlebars before 3.0.8 and 4.x before 4.5.2 is vulnerable to Arbitrary Code Execution. The package's lookup helper fails to properly validate templates, allowing attackers to submit templates that execute arbitrary JavaScript in the system.
<p>Publish Date: 2019-11-14
<p>URL: <a href=https://github.com/handlebars-lang/handlebars.js/commit/d54137810a49939fd2ad01a91a34e182ece4528e>WS-2019-0493</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.5</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://www.npmjs.com/advisories/1316">https://www.npmjs.com/advisories/1316</a></p>
<p>Release Date: 2019-11-14</p>
<p>Fix Resolution: handlebars - 3.0.8,4.5.2</p>
</p>
</details>
<p></p>
<!-- <REMEDIATE>{"isOpenPROnVulnerability":true,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"javascript/Node.js","packageName":"handlebars","packageVersion":"1.3.0","isTransitiveDependency":true,"dependencyTree":"@angular/cli:1.3.0;postcss-url:5.1.2;directory-encoder:0.7.2;handlebars:1.3.0","isMinimumFixVersionAvailable":true,"minimumFixVersion":"handlebars - 3.0.8,4.5.2"}],"vulnerabilityIdentifier":"WS-2019-0493","vulnerabilityDetails":"handlebars before 3.0.8 and 4.x before 4.5.2 is vulnerable to Arbitrary Code Execution. The package\u0027s lookup helper fails to properly validate templates, allowing attackers to submit templates that execute arbitrary JavaScript in the system.","vulnerabilityUrl":"https://github.com/handlebars-lang/handlebars.js/commit/d54137810a49939fd2ad01a91a34e182ece4528e","cvss3Severity":"high","cvss3Score":"7.5","cvss3Metrics":{"A":"High","AC":"Low","PR":"None","S":"Unchanged","C":"None","UI":"None","AV":"Network","I":"None"},"extraData":{}}</REMEDIATE> --> | True | WS-2019-0493 (High) detected in handlebars-1.3.0.tgz - ## WS-2019-0493 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>handlebars-1.3.0.tgz</b></p></summary>
<p>Handlebars provides the power necessary to let you build semantic templates effectively with no frustration</p>
<p>Library home page: <a href="https://registry.npmjs.org/handlebars/-/handlebars-1.3.0.tgz">https://registry.npmjs.org/handlebars/-/handlebars-1.3.0.tgz</a></p>
<p>Path to dependency file: griffin/ui/angular/package.json</p>
<p>Path to vulnerable library: griffin/ui/angular/node_modules/handlebars/package.json</p>
<p>
Dependency Hierarchy:
- cli-1.3.0.tgz (Root Library)
- postcss-url-5.1.2.tgz
- directory-encoder-0.7.2.tgz
- :x: **handlebars-1.3.0.tgz** (Vulnerable Library)
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
handlebars before 3.0.8 and 4.x before 4.5.2 is vulnerable to Arbitrary Code Execution. The package's lookup helper fails to properly validate templates, allowing attackers to submit templates that execute arbitrary JavaScript in the system.
<p>Publish Date: 2019-11-14
<p>URL: <a href=https://github.com/handlebars-lang/handlebars.js/commit/d54137810a49939fd2ad01a91a34e182ece4528e>WS-2019-0493</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.5</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://www.npmjs.com/advisories/1316">https://www.npmjs.com/advisories/1316</a></p>
<p>Release Date: 2019-11-14</p>
<p>Fix Resolution: handlebars - 3.0.8,4.5.2</p>
</p>
</details>
<p></p>
<!-- <REMEDIATE>{"isOpenPROnVulnerability":true,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"javascript/Node.js","packageName":"handlebars","packageVersion":"1.3.0","isTransitiveDependency":true,"dependencyTree":"@angular/cli:1.3.0;postcss-url:5.1.2;directory-encoder:0.7.2;handlebars:1.3.0","isMinimumFixVersionAvailable":true,"minimumFixVersion":"handlebars - 3.0.8,4.5.2"}],"vulnerabilityIdentifier":"WS-2019-0493","vulnerabilityDetails":"handlebars before 3.0.8 and 4.x before 4.5.2 is vulnerable to Arbitrary Code Execution. The package\u0027s lookup helper fails to properly validate templates, allowing attackers to submit templates that execute arbitrary JavaScript in the system.","vulnerabilityUrl":"https://github.com/handlebars-lang/handlebars.js/commit/d54137810a49939fd2ad01a91a34e182ece4528e","cvss3Severity":"high","cvss3Score":"7.5","cvss3Metrics":{"A":"High","AC":"Low","PR":"None","S":"Unchanged","C":"None","UI":"None","AV":"Network","I":"None"},"extraData":{}}</REMEDIATE> --> | non_priority | ws high detected in handlebars tgz ws high severity vulnerability vulnerable library handlebars tgz handlebars provides the power necessary to let you build semantic templates effectively with no frustration library home page a href path to dependency file griffin ui angular package json path to vulnerable library griffin ui angular node modules handlebars package json dependency hierarchy cli tgz root library postcss url tgz directory encoder tgz x handlebars tgz vulnerable library vulnerability details handlebars before and x before is vulnerable to arbitrary code execution the package s lookup helper fails to properly validate templates allowing attackers to submit templates that execute arbitrary javascript in the system publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction none scope unchanged impact metrics 
confidentiality impact none integrity impact none availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution handlebars isopenpronvulnerability true ispackagebased true isdefaultbranch true packages vulnerabilityidentifier ws vulnerabilitydetails handlebars before and x before is vulnerable to arbitrary code execution the package lookup helper fails to properly validate templates allowing attackers to submit templates that execute arbitrary javascript in the system vulnerabilityurl | 0 |
102,872 | 11,308,881,520 | IssuesEvent | 2020-01-19 09:18:02 | npee/RodinProject | https://api.github.com/repos/npee/RodinProject | opened | 데스크탑에도 환경 만들기 | documentation | # 환경설정
## 개발 도구
- PyCharm 2019.3.1
## 라이브러리
---
### 머신러닝용
- opencv 4.2.0
- tensorflow 1.12
- pillow 7.0.0
---
### 서버용
- tornado 6.0.3 | 1.0 | 데스크탑에도 환경 만들기 - # 환경설정
## 개발 도구
- PyCharm 2019.3.1
## 라이브러리
---
### 머신러닝용
- opencv 4.2.0
- tensorflow 1.12
- pillow 7.0.0
---
### 서버용
- tornado 6.0.3 | non_priority | 데스크탑에도 환경 만들기 환경설정 개발 도구 pycharm 라이브러리 머신러닝용 opencv tensorflow pillow 서버용 tornado | 0 |