| Unnamed: 0 (int64, 0-832k) | id (float64, 2.49B-32.1B) | type (stringclasses, 1 value) | created_at (stringlengths, 19) | repo (stringlengths, 7-112) | repo_url (stringlengths, 36-141) | action (stringclasses, 3 values) | title (stringlengths, 1-744) | labels (stringlengths, 4-574) | body (stringlengths, 9-211k) | index (stringclasses, 10 values) | text_combine (stringlengths, 96-211k) | label (stringclasses, 2 values) | text (stringlengths, 96-188k) | binary_label (int64, 0-1) |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
15,481
| 19,688,662,404
|
IssuesEvent
|
2022-01-12 02:44:26
|
qgis/QGIS
|
https://api.github.com/repos/qgis/QGIS
|
closed
|
Processing modeler : Panels position / visibility is not kept
|
Feedback stale Processing GUI/UX Bug Modeller
|
**Describe the bug**
Panel position/visibility is reset when reopening the Processing modeler window (this is annoying, especially on small screens where you may want to hide less useful panels such as the undo history).
**How to Reproduce**
1. Open the processing modeler
2. Hide the undo history widget
3. Close the processing modeler
4. Open the processing modeler -> the undo history widget has reappeared
**QGIS and OS versions**
```
QGIS version
3.14.0-Pi
QGIS code revision
9f7028fd23
Compiled against Qt
5.11.2
Running against Qt
5.11.2
Compiled against GDAL/OGR
3.0.4
Running against GDAL/OGR
3.0.4
Compiled against GEOS
3.8.1-CAPI-1.13.3
Running against GEOS
3.8.1-CAPI-1.13.3
Compiled against SQLite
3.29.0
Running against SQLite
3.29.0
PostgreSQL Client Version
11.5
SpatiaLite Version
4.3.0
QWT Version
6.1.3
QScintilla2 Version
2.10.8
Compiled against PROJ
6.3.2
Running against PROJ
Rel. 6.3.2, May 1st, 2020
OS Version
Windows 10 (10.0)
Active python plugins
pluginbuilder3;
plugin_reloader;
qgepplugin;
QGIS3-getWKT;
swiss_locator;
db_manager;
processing
```
|
1.0
|
Processing modeler : Panels position / visibility is not kept - **Describe the bug**
Panel position/visibility is reset when reopening the Processing modeler window (this is annoying, especially on small screens where you may want to hide less useful panels such as the undo history).
**How to Reproduce**
1. Open the processing modeler
2. Hide the undo history widget
3. Close the processing modeler
4. Open the processing modeler -> the undo history widget has reappeared
**QGIS and OS versions**
```
QGIS version
3.14.0-Pi
QGIS code revision
9f7028fd23
Compiled against Qt
5.11.2
Running against Qt
5.11.2
Compiled against GDAL/OGR
3.0.4
Running against GDAL/OGR
3.0.4
Compiled against GEOS
3.8.1-CAPI-1.13.3
Running against GEOS
3.8.1-CAPI-1.13.3
Compiled against SQLite
3.29.0
Running against SQLite
3.29.0
PostgreSQL Client Version
11.5
SpatiaLite Version
4.3.0
QWT Version
6.1.3
QScintilla2 Version
2.10.8
Compiled against PROJ
6.3.2
Running against PROJ
Rel. 6.3.2, May 1st, 2020
OS Version
Windows 10 (10.0)
Active python plugins
pluginbuilder3;
plugin_reloader;
qgepplugin;
QGIS3-getWKT;
swiss_locator;
db_manager;
processing
```
|
process
|
processing modeler panels position visibility is not kept describe the bug panel position visibilty is reset when reopening the processing modeler window this is annoying esp on small screen where you may want to hide less useful panels such as undo history how to reproduce open the processing modeler hide the undo history widget close the processing modeler open the processing modeler the undo history widget has reappeared qgis and os versions qgis version pi qgis code revision compiled against qt running against qt compiled against gdal ogr running against gdal ogr compiled against geos capi running against geos capi compiled against sqlite running against sqlite postgresql client version spatialite version qwt version version compiled against proj running against proj rel may os version windows active python plugins plugin reloader qgepplugin getwkt swiss locator db manager processing
| 1
|
9,469
| 12,465,367,814
|
IssuesEvent
|
2020-05-28 13:56:57
|
Devnilson/fisima
|
https://api.github.com/repos/Devnilson/fisima
|
closed
|
Create styleguide and linting/prettier configuration
|
process
|
Code styleguide
Version styleguide 0ver -> https://0ver.org/ (0.X.0) for breaking change, (0.current.X) non-breaking change
Commit style -> Semantic "feat: XXXX" "build: XXX", "chore: XXXX"
|
1.0
|
Create styleguide and linting/prettier configuration - Code styleguide
Version styleguide 0ver -> https://0ver.org/ (0.X.0) for breaking change, (0.current.X) non-breaking change
Commit style -> Semantic "feat: XXXX" "build: XXX", "chore: XXXX"
|
process
|
create styleguide and linting prettier configuration code styleguide version styleguide x for breaking change current x non breaking change commit style semantic feat xxxx build xxx chore xxxx
| 1
|
130,123
| 18,022,947,211
|
IssuesEvent
|
2021-09-16 22:11:29
|
dotnet/roslyn
|
https://api.github.com/repos/dotnet/roslyn
|
closed
|
RZ10012 warning cannot be upgraded to error in .editorconfig
|
Bug Area-Analyzers Resolution-By Design New Feature - Source Generators
|
### Describe the bug
When adding the line
```
dotnet_diagnostic.RZ10012.severity = error
```
to `.editorconfig`, I would expect the compiler to report the following razor markup
```
<NonExistentComponent />
```
as an RZ10012 error, but it only reports it as a warning.
|
1.0
|
RZ10012 warning cannot be upgraded to error in .editorconfig - ### Describe the bug
When adding the line
```
dotnet_diagnostic.RZ10012.severity = error
```
to `.editorconfig`, I would expect the compiler to report the following razor markup
```
<NonExistentComponent />
```
as an RZ10012 error, but it only reports it as a warning.
|
non_process
|
warning cannot be upgraded to error in editorconfig describe the bug when adding the line dotnet diagnostic severity error to editorconfig i would expect the compiler to report the following to report the razor markup as an error but it only reports it as a warning
| 0
|
437,465
| 12,598,048,244
|
IssuesEvent
|
2020-06-11 01:47:34
|
WarEmu/WarBugs
|
https://api.github.com/repos/WarEmu/WarBugs
|
closed
|
Officer Coin items pricing
|
Database High Priority
|
<!--
Issues should be unique. Check if someone else reported
the issue first, and please don't report duplicates.
Only ONE issue in a report. Don't forget screens or a video.
-->
**Expected behavior and actual behavior:**
the pricing on the 200 Crafting bottle and 200 talisman box are very different from two different vendors. At Erwin Kruger (near town square) the 200 bottle is 100 officer coins, and the talisman box is 400 officer coins... But if you go to Al Azrae (officer coin vendor) the 200 bottle is listed at 10 officer coins and the 200 talisman box is listed at 40 officer coins.
**Steps to reproduce the problem:**
Open each vendor
**Testing Screenshots/Videos/Evidences (always needed):**
<!-- Drag and drop an image file here to include it directly in the bug report,
no need to upload it to another site -->


<!--
Note that game critical and game breaking bugs may award a manticore/griffon (realm specific) at the leads discretion however, asking for one instantly disqualifies you from this reward.
-->
|
1.0
|
Officer Coin items pricing - <!--
Issues should be unique. Check if someone else reported
the issue first, and please don't report duplicates.
Only ONE issue in a report. Don't forget screens or a video.
-->
**Expected behavior and actual behavior:**
the pricing on the 200 Crafting bottle and 200 talisman box are very different from two different vendors. At Erwin Kruger (near town square) the 200 bottle is 100 officer coins, and the talisman box is 400 officer coins... But if you go to Al Azrae (officer coin vendor) the 200 bottle is listed at 10 officer coins and the 200 talisman box is listed at 40 officer coins.
**Steps to reproduce the problem:**
Open each vendor
**Testing Screenshots/Videos/Evidences (always needed):**
<!-- Drag and drop an image file here to include it directly in the bug report,
no need to upload it to another site -->


<!--
Note that game critical and game breaking bugs may award a manticore/griffon (realm specific) at the leads discretion however, asking for one instantly disqualifies you from this reward.
-->
|
non_process
|
officer coin items pricing issues should be unique check if someone else reported the issue first and please don t report duplicates only one issue in a report don t forget screens or a video expected behavior and actual behavior the pricing on the crafting bottle and talisman box are very different from two different vendors at erwin kruger near town square the bottle is officer coins and the talisman box is officer coins but if you go to al azrae officer coin vendor the bottle is listed at officer coins and the talisman box is listed at officer coins steps to reproduce the problem open each vender testing screenshots videos evidences always needed drag and drop an image file here to include it directly in the bug report no need to upload it to another site note that game critical and game breaking bugs may award a manticore griffon realm specific at the leads discretion however asking for one instantly disqualifies you from this reward
| 0
|
11,184
| 13,957,696,604
|
IssuesEvent
|
2020-10-24 08:11:53
|
alexanderkotsev/geoportal
|
https://api.github.com/repos/alexanderkotsev/geoportal
|
opened
|
ES: Harvesting of Spain
|
ES - Spain Geoportal Harvesting process
|
Dear Angelo,
we have seen that the last harvest of the Spanish node was generated on 23 October 2018.
Why is the harvesting process not being run?
Are there problems harvesting the metadata?
Best Regards
Alejandra
|
1.0
|
ES: Harvesting of Spain - Dear Angelo,
we have seen that the last harvest of the Spanish node was generated on 23 October 2018.
Why is the harvesting process not being run?
Are there problems harvesting the metadata?
Best Regards
Alejandra
|
process
|
es harvesting of spain dear angelo we have seen that the last harvesting to spanish node was generated october of why harvesting proccess are not implementing iquest are there problems to harvesting the metadata best regards alejandra
| 1
|
2,062
| 4,865,910,710
|
IssuesEvent
|
2016-11-14 22:07:01
|
Sage-Bionetworks/Genie
|
https://api.github.com/repos/Sage-Bionetworks/Genie
|
closed
|
GRCC mutation file can't be processed
|
data processing GRCC
|
Fixed with new v.1.6.11 vcf2maf. Empty vcf files no longer throw an error
|
1.0
|
GRCC mutation file can't be processed - Fixed with new v.1.6.11 vcf2maf. Empty vcf files no longer throw an error
|
process
|
grcc mutation file can t be processed fixed with new v empty vcf files no longer throw an error
| 1
|
21,163
| 28,136,771,621
|
IssuesEvent
|
2023-04-01 13:22:59
|
firebase/firebase-cpp-sdk
|
https://api.github.com/repos/firebase/firebase-cpp-sdk
|
reopened
|
[C++] Nightly Integration Testing Report for Firestore
|
type: process nightly-testing
|
<hidden value="integration-test-status-comment"></hidden>
### ✅ [build against repo] Integration test succeeded!
Requested by @sunmou99 on commit 73ce6feb70d3e830676aafa1d0ded64a57f07fb8
Last updated: Sat Apr 1 04:48 PDT 2023
**[View integration test log & download artifacts](https://github.com/firebase/firebase-cpp-sdk/actions/runs/4582570173)**
<hidden value="integration-test-status-comment"></hidden>
***
### ✅ [build against SDK] Integration test succeeded!
Requested by @firebase-workflow-trigger[bot] on commit 73ce6feb70d3e830676aafa1d0ded64a57f07fb8
Last updated: Fri Mar 31 09:55 PDT 2023
**[View integration test log & download artifacts](https://github.com/firebase/firebase-cpp-sdk/actions/runs/4574922030)**
<hidden value="integration-test-status-comment"></hidden>
***
### ✅ [build against tip] Integration test succeeded!
Requested by @sunmou99 on commit 73ce6feb70d3e830676aafa1d0ded64a57f07fb8
Last updated: Sat Apr 1 04:39 PDT 2023
**[View integration test log & download artifacts](https://github.com/firebase/firebase-cpp-sdk/actions/runs/4582793169)**
|
1.0
|
[C++] Nightly Integration Testing Report for Firestore -
<hidden value="integration-test-status-comment"></hidden>
### ✅ [build against repo] Integration test succeeded!
Requested by @sunmou99 on commit 73ce6feb70d3e830676aafa1d0ded64a57f07fb8
Last updated: Sat Apr 1 04:48 PDT 2023
**[View integration test log & download artifacts](https://github.com/firebase/firebase-cpp-sdk/actions/runs/4582570173)**
<hidden value="integration-test-status-comment"></hidden>
***
### ✅ [build against SDK] Integration test succeeded!
Requested by @firebase-workflow-trigger[bot] on commit 73ce6feb70d3e830676aafa1d0ded64a57f07fb8
Last updated: Fri Mar 31 09:55 PDT 2023
**[View integration test log & download artifacts](https://github.com/firebase/firebase-cpp-sdk/actions/runs/4574922030)**
<hidden value="integration-test-status-comment"></hidden>
***
### ✅ [build against tip] Integration test succeeded!
Requested by @sunmou99 on commit 73ce6feb70d3e830676aafa1d0ded64a57f07fb8
Last updated: Sat Apr 1 04:39 PDT 2023
**[View integration test log & download artifacts](https://github.com/firebase/firebase-cpp-sdk/actions/runs/4582793169)**
|
process
|
nightly integration testing report for firestore ✅ nbsp integration test succeeded requested by on commit last updated sat apr pdt ✅ nbsp integration test succeeded requested by firebase workflow trigger on commit last updated fri mar pdt ✅ nbsp integration test succeeded requested by on commit last updated sat apr pdt
| 1
|
39,947
| 6,787,379,579
|
IssuesEvent
|
2017-10-31 03:42:10
|
nhibernate/nhibernate-core
|
https://api.github.com/repos/nhibernate/nhibernate-core
|
closed
|
Lack of custom logging documentation (was Serilog Tracing)
|
c: Documentation
|
How can I configure Serilog for logging NHibernate?
|
1.0
|
Lack of custom logging documentation (was Serilog Tracing) - How can I configure Serilog for logging NHibernate?
|
non_process
|
lack of custom logging documentation was serilog tracing how can i configure serilog for logging nhibernate
| 0
|
18,194
| 24,243,072,998
|
IssuesEvent
|
2022-09-27 08:24:21
|
benthosdev/benthos
|
https://api.github.com/repos/benthosdev/benthos
|
closed
|
awk file operations
|
question processors
|
I want to append content to a specific file in the awk processor. Just like this:
```yaml
processors:
- awk:
codec: "json"
program: |
{
data_value = json_get("items.data")
print data_value >> ./result.json #It's wrong to write
}
```
How should this be achieved?
|
1.0
|
awk file operations - I want to append content to a specific file in the awk processor. Just like this:
```yaml
processors:
- awk:
codec: "json"
program: |
{
data_value = json_get("items.data")
print data_value >> ./result.json #It's wrong to write
}
```
How should this be achieved?
|
process
|
awk file operations i want to append content to specific file in awk processor just like this yaml processors awk codec json program data value json get items data print data value result json it s wrong to write how should this be achieved
| 1
|
116,752
| 24,983,928,076
|
IssuesEvent
|
2022-11-02 13:49:05
|
nmrih/source-game
|
https://api.github.com/repos/nmrih/source-game
|
closed
|
[public-1.13.0] "No zombie respawn" mutator gets overwritten by Nightmare difficulty
|
Status: Reviewed Type: Code Priority: Minimal
|
The mutator works on Classic and Casual difficulties, however on Nightmare it doesn't.
sv_spawn_regen_target is being forced at 0.6 and changing it to 0 has no effect on the spawns. This was tested on nmo_asylum.
|
1.0
|
[public-1.13.0] "No zombie respawn" mutator gets overwritten by Nightmare difficulty - The mutator works on Classic and Casual difficulties, however on Nightmare it doesn't.
sv_spawn_regen_target is being forced at 0.6 and changing it to 0 has no effect on the spawns. This was tested on nmo_asylum.
|
non_process
|
no zombie respawn mutator gets overwritten by nightmare difficulty the mutator works on classic and casual difficulties however on nightmare it doesn t sv spawn regen target is being forced at and changing it to has no effect on the spawns this was tested on nmo asylum
| 0
|
13,556
| 16,100,902,667
|
IssuesEvent
|
2021-04-27 09:09:26
|
Open-EO/openeo-processes
|
https://api.github.com/repos/Open-EO/openeo-processes
|
opened
|
Get data type for value
|
new process
|
It seems it's useful to have a `type` function that returns the data type of the value given.
A first use case is given here: https://github.com/Open-EO/openeo-python-driver/pull/64#issuecomment-827375828
The question for me is whether to let the function work on types or subtypes.
Types would just be: object, array, integer, number, boolean, null (JSON Schema basically, with no way to distinguish between for example vector-cubes or raster-cubes)
Subtypes would be all types defined in meta/subtype-schemas.json, but that seems also pretty hard to achieve as for some types it's hard to determine what it is, especially subtypes for scalars.
Maybe the solution is to go with one of those options:
1. Detect only types, but allow detecting subtypes for objects (and arrays?) in the same process
2. Detect only types, but add an additional function to detect what kind of object something is
|
1.0
|
Get data type for value - It seems it's useful to have a `type` function that returns the data type of the value given.
A first use case is given here: https://github.com/Open-EO/openeo-python-driver/pull/64#issuecomment-827375828
The question for me is whether to let the function work on types or subtypes.
Types would just be: object, array, integer, number, boolean, null (JSON Schema basically, with no way to distinguish between for example vector-cubes or raster-cubes)
Subtypes would be all types defined in meta/subtype-schemas.json, but that seems also pretty hard to achieve as for some types it's hard to determine what it is, especially subtypes for scalars.
Maybe the solution is to go with one of those options:
1. Detect only types, but allow detecting subtypes for objects (and arrays?) in the same process
2. Detect only types, but add an additional function to detect what kind of object something is
|
process
|
get data type for value it seems it s useful to have a type function that returns the data type of the value given a first use case is given here the question for me is whether to let the function work on types or subtypes types would just be object array integer number boolean null json schema basically with no way to distinguish between for example vector cubes or raster cubes subtypes would be all types defined in meta subtype schemas json but that seems also pretty hard to achieve as for some types it s hard to determine what it is especially subtypes for scalars maybe the solution is to go with one of those options detect only types but allow detecting subtypes for objects and arrays in the same process detect only types but add an additional function to detect what kind of object something is
| 1
|
72,971
| 9,634,269,958
|
IssuesEvent
|
2019-05-15 20:47:59
|
DlfinBroom/ChatBot
|
https://api.github.com/repos/DlfinBroom/ChatBot
|
opened
|
Add Regions to the entire project
|
Documentation
|
made regions for the entire project to help organize the projects code
|
1.0
|
Add Regions to the entire project - made regions for the entire project to help organize the projects code
|
non_process
|
add regions to the entire project made regions for the entire project to help organize the projects code
| 0
|
115,576
| 9,805,323,805
|
IssuesEvent
|
2019-06-12 08:44:01
|
Students-of-the-city-of-Kostroma/Student-timetable
|
https://api.github.com/repos/Students-of-the-city-of-Kostroma/Student-timetable
|
opened
|
Исправить сбои в автотестах расположенных в файле UT_Insert_CUniversity
|
Auto test Script Unit test
|
Выявить причину сбоев в автотестах.
При необходимости исправить автотесты и сценарии
#496 Script
|
2.0
|
Исправить сбои в автотестах расположенных в файле UT_Insert_CUniversity - Выявить причину сбоев в автотестах.
При необходимости исправить автотесты и сценарии
#496 Script
|
non_process
|
исправить сбои в автотестах расположенных в файле ut insert cuniversity выявить причину сбоев в автотестах при необходимости исправить автотесты и сценарии script
| 0
|
6,642
| 9,754,212,070
|
IssuesEvent
|
2019-06-04 11:01:24
|
googleapis/google-cloud-java
|
https://api.github.com/repos/googleapis/google-cloud-java
|
closed
|
Kokoro should run integration tests in examples
|
api: bigtable type: process
|
The cloud bigtable examples contain integration tests. It would be nice to set up kokoro to run them:
https://github.com/googleapis/google-cloud-java/tree/master/google-cloud-examples/src/test/java/com/google/cloud/examples/bigtable
@elisheva-qlogic can you provide the mvn command line you used to run the integration tests?
|
1.0
|
Kokoro should run integration tests in examples - The cloud bigtable examples contain integration tests. It would be nice to set up kokoro to run them:
https://github.com/googleapis/google-cloud-java/tree/master/google-cloud-examples/src/test/java/com/google/cloud/examples/bigtable
@elisheva-qlogic can you provide the mvn command line you used to run the integration tests?
|
process
|
kokoro should run integration tests in examples the cloud bigtable examples contain integration tests it would be nice to setup kokoro to run them elisheva qlogic can you provide the mvn command line you used to run the integration tests
| 1
|
20,760
| 27,493,187,027
|
IssuesEvent
|
2023-03-04 21:37:04
|
metabase/metabase
|
https://api.github.com/repos/metabase/metabase
|
closed
|
Error shown for certain column types when getting results from cache
|
Type:Bug Priority:P2 Database/Redshift Querying/Processor .Backend Querying/Cache
|
**Describe the bug**
When returning an `interval` column type via cached results, then an error is shown in the frontend instead of the results.
I'm unsure if this is specific to Redshift or generally for unknown column types. I've not been able to reproduce on Postgres, though a similar, but different issue exists for Postgres #11995
Since `interval` is not fully supported (#2656), then that could be it, but then the error should also exist on Postgres ...?
**To Reproduce**
1. Admin > Settings > Caching: Duration=`0.0001` and TTL=`10000`
2. Native query > **Redshift** > `select interval '1 day'` - run query - see results - save question
3. Do browser refresh a couple of time, so it starts returning cached results (or is supposed to)
The returned "result" is:
`{"nippy/unthawable":{"type":"serializable","cause":"quarantined","class-name":"com.amazon.redshift.util.RedshiftInterval","content":"0xACED0005"}}`
And the log only contains the following related to this, no stacktraces or anything real errors:
`WARN metabase.driver.common Don't know how to map class 'class com.amazon.redshift.util.RedshiftInterval' to a Field base_type, falling back to :type/*.`

**Expected behavior**
1. Correct results from cache, of course
2. An error or stacktrace in the log.
**Information about your Metabase Installation:**
Tested 1.41.5 and master `b9bee5d`
|
1.0
|
Error shown for certain column types when getting results from cache - **Describe the bug**
When returning an `interval` column type via cached results, then an error is shown in the frontend instead of the results.
I'm unsure if this is specific to Redshift or generally for unknown column types. I've not been able to reproduce on Postgres, though a similar, but different issue exists for Postgres #11995
Since `interval` is not fully supported (#2656), then that could be it, but then the error should also exist on Postgres ...?
**To Reproduce**
1. Admin > Settings > Caching: Duration=`0.0001` and TTL=`10000`
2. Native query > **Redshift** > `select interval '1 day'` - run query - see results - save question
3. Do browser refresh a couple of time, so it starts returning cached results (or is supposed to)
The returned "result" is:
`{"nippy/unthawable":{"type":"serializable","cause":"quarantined","class-name":"com.amazon.redshift.util.RedshiftInterval","content":"0xACED0005"}}`
And the log only contains the following related to this, no stacktraces or anything real errors:
`WARN metabase.driver.common Don't know how to map class 'class com.amazon.redshift.util.RedshiftInterval' to a Field base_type, falling back to :type/*.`

**Expected behavior**
1. Correct results from cache, of course
2. An error or stacktrace in the log.
**Information about your Metabase Installation:**
Tested 1.41.5 and master `b9bee5d`
|
process
|
error shown for certain column types when getting results from cache describe the bug when returning an interval column type via cached results then an error is shown in the frontend instead of the results i m unsure if this is specific to redshift or generally for unknown column types i ve not been able to reproduce on postgres though a similar but different issue exists for postgres since interval is not fully supported then that could be it but then the error should also exist on postgres to reproduce admin settings caching duration and ttl native query redshift select interval day run query see results save question do browser refresh a couple of time so it starts returning cached results or is supposed to the returned result is nippy unthawable type serializable cause quarantined class name com amazon redshift util redshiftinterval content and the log only contains the following related to this no stacktraces or anything real errors warn metabase driver common don t know how to map class class com amazon redshift util redshiftinterval to a field base type falling back to type expected behavior correct results from cache of course an error or stracktrace in the log information about your metabase installation tested and master
| 1
|
4,288
| 7,190,681,899
|
IssuesEvent
|
2018-02-02 18:08:31
|
Great-Hill-Corporation/quickBlocks
|
https://api.github.com/repos/Great-Hill-Corporation/quickBlocks
|
closed
|
queryBlock says it writes data to cache, it doesn't
|
libs-etherlib status-inprocess type-enhancement
|
The function queryBlock says it writes data to cache, but the data isn't written until getBlock (which calls into queryBlock) returns. This has the effect of the getBlock tool saying it wrote the data, when in reality it did not. Here's the offending code:
https://github.com/Great-Hill-Corporation/quickBlocks/blob/master/src/libs/etherlib/node.cpp#L223
Block data gets written to cache in one of two places: (a) in blockScrape, and (b) in getBlock but only with -o option.
Solution: revisit why I chose to not write the data to disc if it didn't already exist. To be honest, I don't think I specifically chose to do this.
Note: This may be a feature, not a bug, so consider carefully.
|
1.0
|
queryBlock says it writes data to cache, it doesn't - The function queryBlock says it writes data to cache, but the data isn't written until getBlock (which calls into queryBlock) returns. This has the effect of the getBlock tool saying it wrote the data, when in reality it did not. Here's the offending code:
https://github.com/Great-Hill-Corporation/quickBlocks/blob/master/src/libs/etherlib/node.cpp#L223
Block data gets written to cache in one of two places: (a) in blockScrape, and (b) in getBlock but only with -o option.
Solution: revisit why I chose to not write the data to disc if it didn't already exist. To be honest, I don't think I specifically chose to do this.
Note: This may be a feature, not a bug, so consider carefully.
|
process
|
queryblock says it writes data to cache it doesn t the function queryblock says it writes data to cache but the data isn t written until getblock which calls into queryblock returns this has the effect of the getblock tool saying it wrote the data when in reality it did not here s the offending code block data gets written to cache in one of two places a in blockscrape and b in getblock but only with o option solution revisit why i chose to not write the data to disc if it didn t already exist to be honest i don t think i specifically chose to do this note this may be a feature not a bug so consider carefully
| 1
|
13,709
| 16,469,094,132
|
IssuesEvent
|
2021-05-23 03:16:52
|
rdoddanavar/hpr-sim
|
https://api.github.com/repos/rdoddanavar/hpr-sim
|
closed
|
src/exec/exec.py --> Master script
|
pre-processing
|
Master python script that executes python & cpp subroutines. Takes command line argument(s) of input file paths
ex. (CLI) `$ python3 exec.py myinput.yaml`
|
1.0
|
src/exec/exec.py --> Master script - Master python script that executes python & cpp subroutines. Takes command line argument(s) of input file paths
ex. (CLI) `$ python3 exec.py myinput.yaml`
|
process
|
src exec exec py master script master python script that executes python cpp subroutines takes command line argument s of input file paths ex cli exec py myinput yaml
| 1
|
275,933
| 23,954,701,888
|
IssuesEvent
|
2022-09-12 14:10:08
|
mydumper/mydumper
|
https://api.github.com/repos/mydumper/mydumper
|
closed
|
[BUG] Job counter resets after WARNING is printed
|
bug test case needed
|
**Describe the bug**
After getting the warning `Too many jobs in the queue. We are pausing the jobs creation for 5 seconds`, the `Remaining jobs:` counter resets.
Example:
```
** Message: 21:02:30.903: Thread 4 dumping data for `contributorstats`.`image_stats` WHERE (`image_id` >= 149348000001 AND `image_id` < 149349000001) | Remaining jobs: 199496
** Message: 21:02:30.907: Empty table contributorstats.image_stats
** Message: 21:02:30.907: Thread 1 dumping data for `contributorstats`.`image_stats` WHERE (`image_id` >= 149349000001 AND `image_id` < 149350000001) | Remaining jobs: 199761
** (mydumper:20480): WARNING **: 21:02:30.908: Too many jobs in the queue. We are pausing the jobs creation for 5 seconds.
** Message: 21:02:30.909: Empty table contributorstats.image_stats
** Message: 21:02:30.909: Thread 2 dumping data for `contributorstats`.`image_stats` WHERE (`image_id` >= 149350000001 AND `image_id` < 149351000001) | Remaining jobs: 200000
** Message: 21:02:30.909: Empty table contributorstats.image_stats
** Message: 21:02:30.909: Thread 3 dumping data for `contributorstats`.`image_stats` WHERE (`image_id` >= 149351000001 AND `image_id` < 149352000001) | Remaining jobs: 199999
** Message: 21:02:30.916: Empty table contributorstats.image_stats
```
**To Reproduce**
`mydumper -h <source_instance> --regex ^<some_regex> -o <workdir> -R -G -r 10000000 -t 4 --set-names utf8mb4 -v 3`
What mydumper and myloader version has been used?
```
$ mydumper --version
mydumper 0.11.5, built against MySQL 5.7.34-37
```
**Environment (please complete the following information):**
Centos 7
|
1.0
|
[BUG] Job counter resets after WARNING is printed - **Describe the bug**
After getting the warning `Too many jobs in the queue. We are pausing the jobs creation for 5 seconds`, the `Remaining jobs:` counter resets.
Example:
```
** Message: 21:02:30.903: Thread 4 dumping data for `contributorstats`.`image_stats` WHERE (`image_id` >= 149348000001 AND `image_id` < 149349000001) | Remaining jobs: 199496
** Message: 21:02:30.907: Empty table contributorstats.image_stats
** Message: 21:02:30.907: Thread 1 dumping data for `contributorstats`.`image_stats` WHERE (`image_id` >= 149349000001 AND `image_id` < 149350000001) | Remaining jobs: 199761
** (mydumper:20480): WARNING **: 21:02:30.908: Too many jobs in the queue. We are pausing the jobs creation for 5 seconds.
** Message: 21:02:30.909: Empty table contributorstats.image_stats
** Message: 21:02:30.909: Thread 2 dumping data for `contributorstats`.`image_stats` WHERE (`image_id` >= 149350000001 AND `image_id` < 149351000001) | Remaining jobs: 200000
** Message: 21:02:30.909: Empty table contributorstats.image_stats
** Message: 21:02:30.909: Thread 3 dumping data for `contributorstats`.`image_stats` WHERE (`image_id` >= 149351000001 AND `image_id` < 149352000001) | Remaining jobs: 199999
** Message: 21:02:30.916: Empty table contributorstats.image_stats
```
**To Reproduce**
`mydumper -h <source_instance> --regex ^<some_regex> -o <workdir> -R -G -r 10000000 -t 4 --set-names utf8mb4 -v 3`
What mydumper and myloader version has been used?
```
$ mydumper --version
mydumper 0.11.5, built against MySQL 5.7.34-37
```
**Environment (please complete the following information):**
Centos 7
|
non_process
|
job counter resets after warning is printed describe the bug after getting the warning too many jobs in the queue we are pausing the jobs creation for seconds the remaining jobs counter resets example message thread dumping data for contributorstats image stats where image id and image id remaining jobs message empty table contributorstats image stats message thread dumping data for contributorstats image stats where image id and image id remaining jobs mydumper warning too many jobs in the queue we are pausing the jobs creation for seconds message empty table contributorstats image stats message thread dumping data for contributorstats image stats where image id and image id remaining jobs message empty table contributorstats image stats message thread dumping data for contributorstats image stats where image id and image id remaining jobs message empty table contributorstats image stats to reproduce mydumper h regex o r g r t set names v what mydumper and myloader version has been used mydumper version mydumper built against mysql environment please complete the following information centos
| 0
|
910
| 3,371,996,626
|
IssuesEvent
|
2015-11-23 21:32:33
|
openconnectome/m2g
|
https://api.github.com/repos/openconnectome/m2g
|
closed
|
Corrupted graphs need to be rerun at some point
|
bug data processing
|
While running the parallel converter some graphs failed to be read -- I decided to assemble them here.
NKI-ENH_0186697_biggraphs.graphml
Jung2015_M87160886_biggraphs.graphml
Jung2015_M87164774_biggraphs.graphml
Jung2015_M87183242_biggraphs.graphml
Jung2015_M87109104_biggraphs.graphml
Jung2015_M87138290_biggraphs.graphml
M87164412-bg.graphml
M87111487-bg.graphml
@gkiar please reassign as necessary
|
1.0
|
Corrupted graphs need to be rerun at some point - While running the parallel converter some graphs failed to be read -- I decided to assemble them here.
NKI-ENH_0186697_biggraphs.graphml
Jung2015_M87160886_biggraphs.graphml
Jung2015_M87164774_biggraphs.graphml
Jung2015_M87183242_biggraphs.graphml
Jung2015_M87109104_biggraphs.graphml
Jung2015_M87138290_biggraphs.graphml
M87164412-bg.graphml
M87111487-bg.graphml
@gkiar please reassign as necessary
|
process
|
corrupted graphs need to be rerun at some point while running the parallel converter some graphs failed to be read i decided to assemble them here nki enh biggraphs graphml biggraphs graphml biggraphs graphml biggraphs graphml biggraphs graphml biggraphs graphml bg graphml bg graphml gkiar please reassign as necessary
| 1
|
19,742
| 26,098,321,430
|
IssuesEvent
|
2022-12-27 01:29:37
|
AssetRipper/AssetRipper
|
https://api.github.com/repos/AssetRipper/AssetRipper
|
closed
|
Assembly Processing
|
enhancement scripts processing
|
### Describe the new feature or enhancement
## Context
There is demand for an assembly processing system. This was briefly discussed in #633.
* Processing could enable unit tested script fixing for better decompilation.
* Method bodies could be easily stripped without concerns over stability.
* Any IL2Cpp fixes can be easily ported to Cpp2IL, where desirable.
## Design
This will function as an assembly processor that runs after asset deserialization but before asset processing. The interface might look like this:
```cs
interface IAssemblyProcessor
{
void Process(IAssemblyManager manager);
}
```
|
1.0
|
Assembly Processing - ### Describe the new feature or enhancement
## Context
There is demand for an assembly processing system. This was briefly discussed in #633.
* Processing could enable unit tested script fixing for better decompilation.
* Method bodies could be easily stripped without concerns over stability.
* Any IL2Cpp fixes can be easily ported to Cpp2IL, where desirable.
## Design
This will function as an assembly processor that runs after asset deserialization but before asset processing. The interface might look like this:
```cs
interface IAssemblyProcessor
{
void Process(IAssemblyManager manager);
}
```
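The proposed hook ordering (run after asset deserialization, before asset processing) can be exercised with a tiny driver. The following is a minimal sketch in Python for illustration only — `AssemblyManager`, `MethodStripper`, and `run_processors` are hypothetical stand-ins, not AssetRipper types:

```python
class AssemblyManager:
    """Hypothetical holder for loaded assemblies (stand-in for IAssemblyManager)."""
    def __init__(self, assemblies):
        self.assemblies = assemblies


class MethodStripper:
    """Example processor: records which assemblies it would strip method bodies from."""
    def __init__(self):
        self.processed = []

    def process(self, manager):
        for name in manager.assemblies:
            self.processed.append(name)


def run_processors(manager, processors):
    # Run each registered processor once, in registration order,
    # mirroring the proposed hook point between deserialization
    # and asset processing.
    for p in processors:
        p.process(manager)


manager = AssemblyManager(["Assembly-CSharp", "UnityEngine.CoreModule"])
stripper = MethodStripper()
run_processors(manager, [stripper])
print(stripper.processed)
```

The point is only the hook ordering: processors run once, in registration order, over the already-loaded assemblies.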
|
process
|
assembly processing describe the new feature or enhancement context there is demand for an assembly processing system this was briefly discussed in processing could enable unit tested script fixing for better decompilation method bodies could be easily stripped without concerns over stability any fixes can be easily ported to where desirable design this will function as an assembly processor that runs after asset deserialization but before asset processing the interface might look like this cs interface iassemblyprocessor void process iassemblymanager manager
| 1
|
17,902
| 23,875,389,101
|
IssuesEvent
|
2022-09-07 18:31:36
|
MicrosoftDocs/azure-devops-docs
|
https://api.github.com/repos/MicrosoftDocs/azure-devops-docs
|
closed
|
Build partially succeeded
|
devops/prod needs-more-info cba Pri1 devops-cicd-process/tech
|
Hi,
I have specified that a pipeline should be triggered when another pipeline completes.
It works great with a resource trigger BUT it only works if the build has succeeded. In my case I get a couple of warnings, which is ok, but this makes the build get the status "Build partially succeeded". I want to specify that this status is also ok to trigger a build.

---
#### Document Details
⚠ *Do not edit this section. It is required for docs.microsoft.com ➟ GitHub issue linking.*
* ID: 86285f72-9e28-da97-59bb-c29eb60f627d
* Version Independent ID: 18d5a591-a7d3-c261-6bff-8808ae433f54
* Content: [Configure pipeline triggers - Azure Pipelines](https://docs.microsoft.com/en-us/azure/devops/pipelines/process/pipeline-triggers?view=azure-devops)
* Content Source: [docs/pipelines/process/pipeline-triggers.md](https://github.com/MicrosoftDocs/azure-devops-docs/blob/main/docs/pipelines/process/pipeline-triggers.md)
* Product: **devops**
* Technology: **devops-cicd-process**
* GitHub Login: @steved0x
* Microsoft Alias: **sdanie**
|
1.0
|
Build partially succeeded - Hi,
I have specified that a pipeline should be triggered when another pipeline completes.
It works great with a resource trigger BUT it only works if the build has succeeded. In my case I get a couple of warnings, which is ok, but this makes the build get the status "Build partially succeeded". I want to specify that this status is also ok to trigger a build.

---
#### Document Details
⚠ *Do not edit this section. It is required for docs.microsoft.com ➟ GitHub issue linking.*
* ID: 86285f72-9e28-da97-59bb-c29eb60f627d
* Version Independent ID: 18d5a591-a7d3-c261-6bff-8808ae433f54
* Content: [Configure pipeline triggers - Azure Pipelines](https://docs.microsoft.com/en-us/azure/devops/pipelines/process/pipeline-triggers?view=azure-devops)
* Content Source: [docs/pipelines/process/pipeline-triggers.md](https://github.com/MicrosoftDocs/azure-devops-docs/blob/main/docs/pipelines/process/pipeline-triggers.md)
* Product: **devops**
* Technology: **devops-cicd-process**
* GitHub Login: @steved0x
* Microsoft Alias: **sdanie**
|
process
|
build partially succeeded hi i have specified that a pipeline should be triggered when another pipeline completes it works great with resource trigger but it only works if the build has succeeded in my case i get a couple of warnings which is ok which makes the build get status build partially succeeded i to specify that this status is also ok to trigger a build document details ⚠ do not edit this section it is required for docs microsoft com ➟ github issue linking id version independent id content content source product devops technology devops cicd process github login microsoft alias sdanie
| 1
|
11,337
| 14,148,342,232
|
IssuesEvent
|
2020-11-10 22:22:17
|
googleapis/python-storage
|
https://api.github.com/repos/googleapis/python-storage
|
closed
|
'test_access_to_public_bucket' flakes with 503
|
api: storage flaky testing type: process
|
From [this test run](https://source.cloud.google.com/results/invocations/ee54aa46-e486-4be1-bdfe-829f0ac07ec8/targets/cloud-devrel%2Fclient-libraries%2Fpython%2Fgoogleapis%2Fpython-storage%2Fpresubmit%2Fpresubmit/log):
```python
_______________ TestAnonymousClient.test_access_to_public_bucket _______________
self = <test_system.TestAnonymousClient testMethod=test_access_to_public_bucket>
@vpcsc_config.skip_if_inside_vpcsc
def test_access_to_public_bucket(self):
anonymous = storage.Client.create_anonymous_client()
bucket = anonymous.bucket(self.PUBLIC_BUCKET)
> blob, = retry_429_503(bucket.list_blobs)(max_results=1)
tests/system/test_system.py:1498:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
.nox/system-2-7/lib/python2.7/site-packages/google/api_core/page_iterator.py:212: in _items_iter
for page in self._page_iter(increment=False):
.nox/system-2-7/lib/python2.7/site-packages/google/api_core/page_iterator.py:243: in _page_iter
page = self._next_page()
.nox/system-2-7/lib/python2.7/site-packages/google/api_core/page_iterator.py:369: in _next_page
response = self._get_next_page_response()
.nox/system-2-7/lib/python2.7/site-packages/google/api_core/page_iterator.py:419: in _get_next_page_response
method=self._HTTP_METHOD, path=self.path, query_params=params
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = <google.cloud.storage._http.Connection object at 0x7f5575846150>
method = 'GET', path = '/b/gcp-public-data-landsat/o'
query_params = {'maxResults': 1, 'projection': 'noAcl'}, data = None
content_type = None, headers = None, api_base_url = None, api_version = None
expect_json = True, _target_object = None, timeout = 60
def api_request(
self,
method,
path,
query_params=None,
data=None,
content_type=None,
headers=None,
api_base_url=None,
api_version=None,
expect_json=True,
_target_object=None,
timeout=_DEFAULT_TIMEOUT,
):
... # docstring elided
url = self.build_api_url(
path=path,
query_params=query_params,
api_base_url=api_base_url,
api_version=api_version,
)
# Making the executive decision that any dictionary
# data will be sent properly as JSON.
if data and isinstance(data, dict):
data = json.dumps(data)
content_type = "application/json"
response = self._make_request(
method=method,
url=url,
data=data,
content_type=content_type,
headers=headers,
target_object=_target_object,
timeout=timeout,
)
if not 200 <= response.status_code < 300:
> raise exceptions.from_http_response(response)
E ServiceUnavailable: 503 GET https://storage.googleapis.com/storage/v1/b/gcp-public-data-landsat/o?projection=noAcl&maxResults=1: <html><head><meta http-equiv="content-type" content="text/html; charset=utf-8"/><title>Sorry...</title><style> body { font-family: verdana, arial, sans-serif; background-color: #fff; color: #000; }</style></head><body><div><table><tr><td><b><font face=sans-serif size=10><font color=#4285f4>G</font><font color=#ea4335>o</font><font color=#fbbc05>o</font><font color=#4285f4>g</font><font color=#34a853>l</font><font color=#ea4335>e</font></font></b></td><td style="text-align: left; vertical-align: bottom; padding-bottom: 15px; width: 50%"><div style="border-bottom: 1px solid #dfdfdf;">Sorry...</div></td></tr></table></div><div style="margin-left: 4em;"><h1>We're sorry...</h1><p>... but your computer or network may be sending automated queries. To protect our users, we can't process your request right now.</p></div><div style="margin-left: 4em;">See <a href="https://support.google.com/websearch/answer/86640">Google Help</a> for more information.<br/><br/></div><div style="text-align: center; border-top: 1px solid #dfdfdf;"><a href="https://www.google.com">Google Home</a></div></body></html>
.nox/system-2-7/lib/python2.7/site-packages/google/cloud/_http.py:423: ServiceUnavailable
```
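The `retry_429_503` wrapper in the traceback suggests transient errors are expected and retried around the flaky call; a minimal retry-with-backoff sketch (a hypothetical illustration, not the library's actual decorator) might look like this:

```python
import time


class ServiceUnavailable(Exception):
    """Stand-in for google.api_core.exceptions.ServiceUnavailable (HTTP 503)."""


def retry_transient(func, attempts=3, base_delay=0.0):
    # Wrap a callable so transient 503s are retried with simple
    # exponential backoff; the last failure is re-raised.
    def wrapped(*args, **kwargs):
        for i in range(attempts):
            try:
                return func(*args, **kwargs)
            except ServiceUnavailable:
                if i == attempts - 1:
                    raise
                time.sleep(base_delay * (2 ** i))
    return wrapped


calls = {"n": 0}

def flaky():
    # Fails twice with a 503, then succeeds.
    calls["n"] += 1
    if calls["n"] < 3:
        raise ServiceUnavailable("503")
    return "ok"


print(retry_transient(flaky)())  # succeeds on the third attempt
```

A real implementation would also cap the total wait time and re-raise non-transient errors unchanged.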
|
1.0
|
'test_access_to_public_bucket' flakes with 503 - From [this test run](https://source.cloud.google.com/results/invocations/ee54aa46-e486-4be1-bdfe-829f0ac07ec8/targets/cloud-devrel%2Fclient-libraries%2Fpython%2Fgoogleapis%2Fpython-storage%2Fpresubmit%2Fpresubmit/log):
```python
_______________ TestAnonymousClient.test_access_to_public_bucket _______________
self = <test_system.TestAnonymousClient testMethod=test_access_to_public_bucket>
@vpcsc_config.skip_if_inside_vpcsc
def test_access_to_public_bucket(self):
anonymous = storage.Client.create_anonymous_client()
bucket = anonymous.bucket(self.PUBLIC_BUCKET)
> blob, = retry_429_503(bucket.list_blobs)(max_results=1)
tests/system/test_system.py:1498:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
.nox/system-2-7/lib/python2.7/site-packages/google/api_core/page_iterator.py:212: in _items_iter
for page in self._page_iter(increment=False):
.nox/system-2-7/lib/python2.7/site-packages/google/api_core/page_iterator.py:243: in _page_iter
page = self._next_page()
.nox/system-2-7/lib/python2.7/site-packages/google/api_core/page_iterator.py:369: in _next_page
response = self._get_next_page_response()
.nox/system-2-7/lib/python2.7/site-packages/google/api_core/page_iterator.py:419: in _get_next_page_response
method=self._HTTP_METHOD, path=self.path, query_params=params
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = <google.cloud.storage._http.Connection object at 0x7f5575846150>
method = 'GET', path = '/b/gcp-public-data-landsat/o'
query_params = {'maxResults': 1, 'projection': 'noAcl'}, data = None
content_type = None, headers = None, api_base_url = None, api_version = None
expect_json = True, _target_object = None, timeout = 60
def api_request(
self,
method,
path,
query_params=None,
data=None,
content_type=None,
headers=None,
api_base_url=None,
api_version=None,
expect_json=True,
_target_object=None,
timeout=_DEFAULT_TIMEOUT,
):
... # docstring elided
url = self.build_api_url(
path=path,
query_params=query_params,
api_base_url=api_base_url,
api_version=api_version,
)
# Making the executive decision that any dictionary
# data will be sent properly as JSON.
if data and isinstance(data, dict):
data = json.dumps(data)
content_type = "application/json"
response = self._make_request(
method=method,
url=url,
data=data,
content_type=content_type,
headers=headers,
target_object=_target_object,
timeout=timeout,
)
if not 200 <= response.status_code < 300:
> raise exceptions.from_http_response(response)
E ServiceUnavailable: 503 GET https://storage.googleapis.com/storage/v1/b/gcp-public-data-landsat/o?projection=noAcl&maxResults=1: <html><head><meta http-equiv="content-type" content="text/html; charset=utf-8"/><title>Sorry...</title><style> body { font-family: verdana, arial, sans-serif; background-color: #fff; color: #000; }</style></head><body><div><table><tr><td><b><font face=sans-serif size=10><font color=#4285f4>G</font><font color=#ea4335>o</font><font color=#fbbc05>o</font><font color=#4285f4>g</font><font color=#34a853>l</font><font color=#ea4335>e</font></font></b></td><td style="text-align: left; vertical-align: bottom; padding-bottom: 15px; width: 50%"><div style="border-bottom: 1px solid #dfdfdf;">Sorry...</div></td></tr></table></div><div style="margin-left: 4em;"><h1>We're sorry...</h1><p>... but your computer or network may be sending automated queries. To protect our users, we can't process your request right now.</p></div><div style="margin-left: 4em;">See <a href="https://support.google.com/websearch/answer/86640">Google Help</a> for more information.<br/><br/></div><div style="text-align: center; border-top: 1px solid #dfdfdf;"><a href="https://www.google.com">Google Home</a></div></body></html>
.nox/system-2-7/lib/python2.7/site-packages/google/cloud/_http.py:423: ServiceUnavailable
```
|
process
|
test access to public bucket flakes with from python testanonymousclient test access to public bucket self vpcsc config skip if inside vpcsc def test access to public bucket self anonymous storage client create anonymous client bucket anonymous bucket self public bucket blob retry bucket list blobs max results tests system test system py nox system lib site packages google api core page iterator py in items iter for page in self page iter increment false nox system lib site packages google api core page iterator py in page iter page self next page nox system lib site packages google api core page iterator py in next page response self get next page response nox system lib site packages google api core page iterator py in get next page response method self http method path self path query params params self method get path b gcp public data landsat o query params maxresults projection noacl data none content type none headers none api base url none api version none expect json true target object none timeout def api request self method path query params none data none content type none headers none api base url none api version none expect json true target object none timeout default timeout docstring elided url self build api url path path query params query params api base url api base url api version api version making the executive decision that any dictionary data will be sent properly as json if data and isinstance data dict data json dumps data content type application json response self make request method method url url data data content type content type headers headers target object target object timeout timeout if not response status code raise exceptions from http response response e serviceunavailable get sorry body font family verdana arial sans serif background color fff color g o o g l e sorry we re sorry but your computer or network may be sending automated queries to protect our users we can t process your request right now see for more 
information nox system lib site packages google cloud http py serviceunavailable
| 1
|
345,226
| 30,792,673,524
|
IssuesEvent
|
2023-07-31 17:20:45
|
MD-Anderson-Bioinformatics/NG-CHM
|
https://api.github.com/repos/MD-Anderson-Bioinformatics/NG-CHM
|
closed
|
Unable to deselect dendrogram sections with Ctrl-click.
|
bug backlog 2.17.6 passed retest
|
Originally Pivotal Tracker issue: https://www.pivotaltracker.com/story/show/173155631
bug entered by James M. Melott on 06/03/2020
Previously, I believe we were able to select multiple dendrogram areas on a map by pressing ctrl click and then deselect some of them by using ctrl-click on the same area. Ctrl-click only seems to add now and will not let you deselect an area.
|
1.0
|
Unable to deselect dendrogram sections with Ctrl-click. - Originally Pivotal Tracker issue: https://www.pivotaltracker.com/story/show/173155631
bug entered by James M. Melott on 06/03/2020
Previously, I believe we were able to select multiple dendrogram areas on a map by pressing ctrl click and then deselect some of them by using ctrl-click on the same area. Ctrl-click only seems to add now and will not let you deselect an area.
|
non_process
|
unable to deselect dendrogram sections with ctrl click originally pivotal tracker issue bug entered by james m melott on previously i believe we were able to select multiple dendrogram areas on a map by pressing ctrl click and then deselect some of them by using ctrl click on the same area ctrl click only seems to add now and will not let you deselect an area
| 0
|
763,899
| 26,777,614,827
|
IssuesEvent
|
2023-01-31 18:21:56
|
gamefreedomgit/Maelstrom
|
https://api.github.com/repos/gamefreedomgit/Maelstrom
|
closed
|
[Mage] Fire Mage -Haste does not provide proper ticks of fire dots
|
Class: Mage Spell Priority: High Status: Confirmed
|
**Description:** Haste should give more ticks for Pyroblast / living bomb / combustion / Firefrost bolt (glyph ), but it doesn't
1- **Combustion**:-
Combustion ticks 10 times no matter what your haste is.
Haste should increase its ticks as shown in the table below:-

2- **Pyroblast / living bomb / Firefrost bolt (glyph):-**
Haste increases the ticks at wrong percentages for the spells above, the table below shows how it should be

**How to reproduce:**
1- cast combustion with 5% haste for example, you will see it ticks for 10 instead of 11
2- Pyroblast / living bomb / Firefrost bolt (glyph): have 12.5% haste, cast pyroblast or living bomb or firefrost bolt on a target, you will see it only ticks 4 times instead of 5
The attached excel file has more details about this issue, includes racials, talents and spells that increases haste and how much haste rating should be
[Fire mage damage over time ticks.xlsx](https://github.com/gamefreedomgit/Maelstrom/files/10502280/Fire.mage.damage.over.time.ticks.xlsx)

https://www.tauri-veins.tk/fire-mage-wow-pve-dps-statistics-priority-reforging
|
1.0
|
[Mage] Fire Mage -Haste does not provide proper ticks of fire dots - **Description:** Haste should give more ticks for Pyroblast / living bomb / combustion / Firefrost bolt (glyph ), but it doesn't
1- **Combustion**:-
Combustion ticks 10 times no matter what your haste is.
Haste should increase its ticks as shown in the table below:-

2- **Pyroblast / living bomb / Firefrost bolt (glyph):-**
Haste increases the ticks at wrong percentages for the spells above, the table below shows how it should be

**How to reproduce:**
1- cast combustion with 5% haste for example, you will see it ticks for 10 instead of 11
2- Pyroblast / living bomb / Firefrost bolt (glyph): have 12.5% haste, cast pyroblast or living bomb or firefrost bolt on a target, you will see it only ticks 4 times instead of 5
The attached excel file has more details about this issue, includes racials, talents and spells that increases haste and how much haste rating should be
[Fire mage damage over time ticks.xlsx](https://github.com/gamefreedomgit/Maelstrom/files/10502280/Fire.mage.damage.over.time.ticks.xlsx)

https://www.tauri-veins.tk/fire-mage-wow-pve-dps-statistics-priority-reforging
|
non_process
|
fire mage haste does not provide proper ticks of fire dots description haste should give more ticks for pyroblast living bomb combustion firefrost bolt glyph but it doesn t combustion combustiont icks times no matter what your haste is haste should increase its ticks as shown in the table below pyroblast living bomb firefrost bolt glyph haste increases the ticks at wrong percentages for the spells above the table below shows how it should be how to reproduce cast combustion with haste for example you will see it ticks for instead of pyroblast living bomb firefrost bolt glyph have cast pyroblast or living bomb or firefrost bolt on a target you will see it only ticks for times instead of the attached excel file has more details about this issue includes racials talents and spells that increases haste and how much haste rating should be
| 0
|
10,236
| 13,096,800,369
|
IssuesEvent
|
2020-08-03 16:17:31
|
pingcap/tidb
|
https://api.github.com/repos/pingcap/tidb
|
closed
|
[proposal] DistSQL Cache on TiDB
|
component/coprocessor proposal type/feature-request
|
## Feature Request
Add a cache at the DistSQL level to reduce TiKV computation for DistSQL and make repeated SQL run faster from the second execution onward.
## Background
Before this proposal there is a DistSQL cache implemented on the TiKV side, and this may introduce some problems:
* One DistSQL cache has one lock, so TiKV may block on this lock when many DistSQL requests are sent to it, and it can even block inserts and updates.
* It will introduce more overhead on cache memory and life-cycle management on TiKV side.
## New Design
This design is more like HTTP cache mechanism.
* On TiKV side each DistSQL result will take Raft Log ID
* On TiDB side DistSQLCache will hold result data and Raft Log ID
* DistSQLCache key will be generated from DistSQL’s SQL
* DistSQLCache entry will hold three data: Raft Log ID, TS and result’s data
* On DistSQL execute, if DistSQL hit cache it will send cached Raft Log ID to TiKV
* If TiKV finds the Raft Log ID matches this DistSQL’s snapshot, it will send an empty result with the match_raft_log_id field set.
* If TiDB sees match_raft_log_id, the DistSQL result will be read from DistSQLCache.
This is basically like HTTP Last Modified, If-Modified-Since and 304 Not Modified logic.
Below is the pseudocode on the TiDB side:
```
func DistSQL::execute() {
if (this.worth_cache()) {
cache_key = this.generate_cache_key()
entry = DistSQLCache.get(cache_key)
if (entry != NULL && entry.TS <= this.TS) {
result = this.send_to_tikv_with_raft_log_id(entry.raft_log_id)
if (result.match_raft_log_id) {
return entry.data
} else {
return result.data
}
} else {
result = this.send_to_tikv()
raft_log_id = result.raft_log_id
DistSQLCache.put(cache_key, raft_log_id, result.data)
return result.data
}
} else {
result = this.send_to_tikv()
return result.data
}
}
```
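The If-Modified-Since / 304 analogy above can also be sketched from the TiDB side. This is a minimal illustration in Python — `DistSQLCache`, `execute`, and the shape of the `send_to_tikv` callback are hypothetical, not TiDB's actual API:

```python
class DistSQLCache:
    # Maps a cache key (derived from the SQL text) to (raft_log_id, ts, data).
    def __init__(self):
        self.entries = {}

    def get(self, key):
        return self.entries.get(key)

    def put(self, key, raft_log_id, ts, data):
        self.entries[key] = (raft_log_id, ts, data)


def execute(cache, key, ts, send_to_tikv):
    # send_to_tikv(cached_raft_log_id) returns (match, raft_log_id, data).
    # match=True means TiKV's snapshot still matches the cached Raft log ID,
    # so the cached data can be reused (the "304 Not Modified" case).
    entry = cache.get(key)
    cached_id = entry[0] if entry and entry[1] <= ts else None
    match, raft_log_id, data = send_to_tikv(cached_id)
    if match and entry:
        return entry[2]
    cache.put(key, raft_log_id, ts, data)
    return data
```

On a cache hit, TiKV only has to confirm the Raft log ID instead of re-sending the full result set.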
With this design we will get some benefits:
* No cache lock on TiKV side
* No cache memory and life-cycle management on TiKV side
* Cache can be implemented as an interface, and a plugin system can be introduced to let the end user determine which cache system to use.
## TODO List
1. Update protobuf structural for coprocessor
2. TiKV support fill Raft Log ID for each coprocessor query
3. TiKV support check Raft Log ID for coprocessor query
4. Add cache logic for TiDB
|
1.0
|
[proposal] DistSQL Cache on TiDB - ## Feature Request
Add a cache at the DistSQL level to reduce TiKV computation for DistSQL and make repeated SQL run faster from the second execution onward.
## Background
Before this proposal there is a DistSQL cache implemented on the TiKV side, and this may introduce some problems:
* One DistSQL cache has one lock, so TiKV may block on this lock when many DistSQL requests are sent to it, and it can even block inserts and updates.
* It will introduce more overhead on cache memory and life-cycle management on TiKV side.
## New Design
This design is more like HTTP cache mechanism.
* On TiKV side each DistSQL result will take Raft Log ID
* On TiDB side DistSQLCache will hold result data and Raft Log ID
* DistSQLCache key will be generated from DistSQL’s SQL
* DistSQLCache entry will hold three data: Raft Log ID, TS and result’s data
* On DistSQL execute, if DistSQL hit cache it will send cached Raft Log ID to TiKV
* If TiKV finds the Raft Log ID matches this DistSQL’s snapshot, it will send an empty result with the match_raft_log_id field set.
* If TiDB sees match_raft_log_id, the DistSQL result will be read from DistSQLCache.
This is basically like HTTP Last Modified, If-Modified-Since and 304 Not Modified logic.
Below is the pseudocode on the TiDB side:
```
func DistSQL::execute() {
if (this.worth_cache()) {
cache_key = this.generate_cache_key()
entry = DistSQLCache.get(cache_key)
if (entry != NULL && entry.TS <= this.TS) {
result = this.send_to_tikv_with_raft_log_id(entry.raft_log_id)
if (result.match_raft_log_id) {
return entry.data
} else {
return result.data
}
} else {
result = this.send_to_tikv()
raft_log_id = result.raft_log_id
DistSQLCache.put(cache_key, raft_log_id, result.data)
return result.data
}
} else {
result = this.send_to_tikv()
return result.data
}
}
```
With this design we will get some benefits:
* No cache lock on TiKV side
* No cache memory and life-cycle management on TiKV side
* Cache can be implemented as an interface, and a plugin system can be introduced to let the end user determine which cache system to use.
## TODO List
1. Update protobuf structural for coprocessor
2. TiKV support fill Raft Log ID for each coprocessor query
3. TiKV support check Raft Log ID for coprocessor query
4. Add cache logic for TiDB
|
process
|
distsql cache on tidb feature request add cache on distsql level to reduce tikv calculation on distsql and make repeat sql run faster after second time background before this proposal there has a distsql implements on tikv side and this may introduce some problem one distsql cache have one lock so tikv may block on this lock when many distsql is send to it and it will even block insert and update it will introduce more overhead on cache memory and life cycle management on tikv side new design this design is more like http cache mechanism on tikv side each distsql result will take raft log id on tidb side distsqlcache will hold result data and raft log id distsqlcache key will generated by distsql’s sql distsqlcache entry will hold three data raft log id ts and result’s data on distsql execute if distsql hit cache it will send cached raft log id to tikv if tikv found the raft log id is match with this distsql’s snapshot it will send a empty result and with match raft log id field if tidb see match raft log id distsql result will get from distsqlcache this is basically like http last modified if modified since and not modified logic below is the pesudo code on tidb side func distsql execute if this worth cache cache key this generate cache key entry distsqlcache get cache key if entry null entry ts this ts result this send to tikv with raft log id entry raft log id if result match raft log id return entry data else return result data else result this send to tikv raft log id result raft log id distsqlcache put cache key raft log id result data return result data else result this send to tikv return result data this design we will get some benefit no cache lock on tikv side no cache memory and life cycle management on tikv side cache can implement as interface and can introduce plugin system to let end user to determine which cache system can be use todo list update protobuf structural for coprocessor tikv support fill raft log id for each coprocessor query tikv 
support check raft log id for coprocessor query add cache logic for tidb
| 1
|
407,736
| 27,629,433,112
|
IssuesEvent
|
2023-03-10 09:43:27
|
sblauth/cashocs
|
https://api.github.com/repos/sblauth/cashocs
|
opened
|
[Documentation] Add version warning banners once supported by the theme
|
documentation
|
It would be beneficial to have (loud) version warning banners for the docs.
Once they are implemented in the theme, use them for cashocs.
|
1.0
|
[Documentation] Add version warning banners once supported by the theme - It would be beneficial to have (loud) version warning banners for the docs.
Once they are implemented in the theme, use them for cashocs.
|
non_process
|
add version warning banners once supported by the theme it would be beneficial to have loud version warning banners for the docs once they are implemented in the theme use them for cashocs
| 0
|
170,480
| 14,261,555,415
|
IssuesEvent
|
2020-11-20 11:31:08
|
nim-lang/Nim
|
https://api.github.com/repos/nim-lang/Nim
|
closed
|
Small tutorial error
|
Documentation Easy
|
Hi,
I'm recently learning Nim and I think there might be an error in the tutorial at this
[link](https://nim-lang.org/docs/tut1.html).
In the sample code of Break Statement, the last echo says [echo "still in block" ] but I think the program should've left the block at that point. It's a small one but let me know if I'm wrong. Thanks
|
1.0
|
Small tutorial error - Hi,
I'm recently learning Nim and I think there might be an error in the tutorial at this
[link](https://nim-lang.org/docs/tut1.html).
In the sample code of Break Statement, the last echo says [echo "still in block" ] but I think the program should've left the block at that point. It's a small one but let me know if I'm wrong. Thanks
|
non_process
|
small tutorial error hi i m recenly learning nim and i think there might be an error in the tutorial at this in the sample code of break statement the last echo says but i think the program should ve left the block at that point it s a small one but let me know if i m wrong thanks
| 0
|
411
| 2,851,466,527
|
IssuesEvent
|
2015-06-01 07:04:17
|
genomizer/genomizer-server
|
https://api.github.com/repos/genomizer/genomizer-server
|
closed
|
Using the username when uploading file
|
enhancement Processing
|
Could we let the author of an uploaded file be the username that is connected with the authorization token given in the request header?
|
1.0
|
Using the username when uploading file - Could we let the author of an uploaded file be the username that is connected with the authorization token given in the request header?
|
process
|
using the username when uploading file could we let the author of an uploaded file be the username that is connected with the authorization token given in the request header
| 1
|
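The request above is to derive an uploaded file's author from the authorization token in the request header rather than from a user-supplied field. A minimal pure-Python sketch of that behaviour (the token store, header name, and function names here are hypothetical, not Genomizer's actual API):

```python
# Hypothetical token-to-username mapping standing in for a session store.
TOKEN_STORE = {"tok-123": "alice", "tok-456": "bob"}

def author_from_headers(headers):
    """Resolve the uploader's username from the Authorization header, or None."""
    token = headers.get("Authorization", "").removeprefix("Bearer ").strip()
    return TOKEN_STORE.get(token)

def register_upload(headers, filename):
    """Record an upload, taking the author from the auth token only."""
    author = author_from_headers(headers)
    if author is None:
        raise PermissionError("unknown or missing authorization token")
    return {"file": filename, "author": author}
```

Tying the author to the token means a client cannot claim someone else's name on an upload.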
8,944
| 12,058,026,070
|
IssuesEvent
|
2020-04-15 16:45:47
|
prisma/prisma2-docs
|
https://api.github.com/repos/prisma/prisma2-docs
|
closed
|
Meta tags not picked up on Twitter
|
process/candidate
|
Meta tags (title, description, image) don't seem to be picked up on Twitter, see this tweet: https://twitter.com/nikolasburk/status/1249797446458368003?s=12
More context in this [internal Slack conversation](https://prisma-company.slack.com/archives/C5Z9TH6N9/p1586813020007900).
|
1.0
|
Meta tags not picked up on Twitter - Meta tags (title, description, image) don't seem to be picked up on Twitter, see this tweet: https://twitter.com/nikolasburk/status/1249797446458368003?s=12
More context in this [internal Slack conversation](https://prisma-company.slack.com/archives/C5Z9TH6N9/p1586813020007900).
|
process
|
meta tags not picked up on twitter meta tags title description image don t seem to be picked up on twitter see this tweet more context in this
| 1
|
17,016
| 22,388,203,854
|
IssuesEvent
|
2022-06-17 03:38:01
|
qgis/QGIS
|
https://api.github.com/repos/qgis/QGIS
|
closed
|
layer_property(layer, 'path') returns NULL for local raster layer
|
Raster Processing Bug
|
### What is the bug or the crash?
The expression function `layer_property(layer, property)` says:
`path: File path to the layer data source. Only available for file based layers.`
For me it returns `NULL` for any raster file layer I have tried, in different locations on my local system. It works correctly for vector layers, e.g. the "world" easter egg layer's file.
### Steps to reproduce the issue
1. Load a raster file as layer
2. Use any place you can get the expression editor dialog (e.g. Processing -> "Geometry by expression")
3. Enter `layer_property('Your Layer ID', 'path')`
The preview should show the path to the layer but instead shows `NULL`.
### Versions
master
### Supported QGIS version
- [X] I'm running a supported QGIS version according to the roadmap.
### New profile
- [X] I tried with a new QGIS profile
### Additional context
_No response_
|
1.0
|
layer_property(layer, 'path') returns NULL for local raster layer - ### What is the bug or the crash?
The expression function `layer_property(layer, property)` says:
`path: File path to the layer data source. Only available for file based layers.`
For me it returns `NULL` for any raster file layer I have tried, in different locations on my local system. It works correctly for vector layers, e.g. the "world" easter egg layer's file.
### Steps to reproduce the issue
1. Load a raster file as layer
2. Use any place you can get the expression editor dialog (e.g. Processing -> "Geometry by expression")
3. Enter `layer_property('Your Layer ID', 'path')`
The preview should show the path to the layer but instead shows `NULL`.
### Versions
master
### Supported QGIS version
- [X] I'm running a supported QGIS version according to the roadmap.
### New profile
- [X] I tried with a new QGIS profile
### Additional context
_No response_
|
process
|
layer property layer path returns null for local raster layer what is the bug or the crash the expression function layer property layer property says path file path to the layer data source only available for file based layers for me it returns null for any raster file layer i have tried in different locations on my local system it works correctly for vector layers e g the world easter egg layer s file steps to reproduce the issue load a raster file as layer use any place you can get the expression editor dialog e g processing geometry by expression enter layer property your layer id path the preview should show the path to the layer but instead shows null versions master supported qgis version i m running a supported qgis version according to the roadmap new profile i tried with a new qgis profile additional context no response
| 1
|
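The bug report above describes `layer_property(layer, 'path')` returning NULL only for raster layers while vector layers resolve correctly. As a pure-Python illustration of that asymmetry (toy layer registry and keys, not QGIS's real provider model):

```python
# Toy model of the reported bug: 'path' resolves for vector layers but
# comes back as None (rendered as NULL) for raster layers.
LAYERS = {
    "world": {"type": "vector", "source": "/data/world.gpkg"},
    "dem":   {"type": "raster", "source": "/data/dem.tif"},
}

def layer_property(layer_id, prop):
    layer = LAYERS[layer_id]
    if prop == "path":
        # Buggy behaviour: the file path is only exposed for vector providers,
        # even though the raster layer is file-based too.
        return layer["source"] if layer["type"] == "vector" else None
    raise KeyError(prop)
```

The fix, per the documented contract ("Only available for file based layers"), would be to return the source path for any file-based provider regardless of layer type.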
331,447
| 24,308,547,719
|
IssuesEvent
|
2022-09-29 19:46:50
|
zowe/zowe-cli
|
https://api.github.com/repos/zowe/zowe-cli
|
closed
|
Contributing.md needs enhanced with information on how to open issues
|
documentation
|
_From @MarkAckert on April 17, 2018 12:27_
[MIGRATED]
It's nice the fact that we now have a CONTRIBUTING.md file.
However, there is no information on how to open an issue.
So, it would be nice if it had some information there, given that GitHub itself recommends you see the contributing guidelines.

_Copied from original issue: gizafoundation/brightside#11_
|
1.0
|
Contributing.md needs enhanced with information on how to open issues - _From @MarkAckert on April 17, 2018 12:27_
[MIGRATED]
It's nice the fact that we now have a CONTRIBUTING.md file.
However, there is no information on how to open an issue.
So, it would be nice if it had some information there, given that GitHub itself recommends you see the contributing guidelines.

_Copied from original issue: gizafoundation/brightside#11_
|
non_process
|
contributing md needs enhanced with information on how to open issues from markackert on april it s nice the fact that we now have a contributing md file however there is no information on how to open an issue so it would be nice if it had some information there given that github itself recommends you see the contributing guidelines copied from original issue gizafoundation brightside
| 0
|
107,471
| 4,309,220,704
|
IssuesEvent
|
2016-07-21 15:21:58
|
opengovfoundation/madison
|
https://api.github.com/repos/opengovfoundation/madison
|
closed
|
Flagging comment results in 500
|
Needs: Angular Priority Type: Bug
|
Looks like the url string isn't being put together properly, posts to `https://drafts.dc.gov/api/docs/34/comments/undefined/flags`
|
1.0
|
Flagging comment results in 500 - Looks like the url string isn't being put together properly, posts to `https://drafts.dc.gov/api/docs/34/comments/undefined/flags`
|
non_process
|
flagging comment results in looks like the url string isn t being put together properly posts to
| 0
|
264
| 2,694,666,459
|
IssuesEvent
|
2015-04-01 21:36:04
|
dalehenrich/metacello-work
|
https://api.github.com/repos/dalehenrich/metacello-work
|
closed
|
Bootstrapify current metacello
|
in process
|
#316 fixed a crucial dependency for Squeak, but it is necessary that the current state is incorporated into the mcz-based bootstrap (There is no trace of metacello in Squeak, yet)
|
1.0
|
Bootstrapify current metacello - #316 fixed a crucial dependency for Squeak, but it is necessary that the current state is incorporated into the mcz-based bootstrap (There is no trace of metacello in Squeak, yet)
|
process
|
bootstrapify current metacello fixed a crucial dependency for squeak but it is necessary that the current state is incorporated into the mcz based bootstrap there is no trace of metacello in squeak yet
| 1
|
51,832
| 27,258,791,214
|
IssuesEvent
|
2023-02-22 13:31:12
|
SixLabors/ImageSharp
|
https://api.github.com/repos/SixLabors/ImageSharp
|
closed
|
Vector{128,256} operations that use MmShuffle fall back to method call
|
area:performance
|
### Prerequisites
- [X] I have written a descriptive issue title
- [X] I have verified that I am running the latest version of ImageSharp
- [X] I have verified if the problem exist in both `DEBUG` and `RELEASE` mode
- [X] I have searched [open](https://github.com/SixLabors/ImageSharp/issues) and [closed](https://github.com/SixLabors/ImageSharp/issues?q=is%3Aissue+is%3Aclosed) issues to ensure it has not already been reported
### ImageSharp version
Current main branch
### Other ImageSharp packages and versions
none
### Environment (Operating system, version and so on)
all .NET supported
### .NET Framework version
all
### Description
While working on https://github.com/SixLabors/ImageSharp/issues/1762 I recognized that methods that use https://github.com/SixLabors/ImageSharp/blob/c661ab1478419004271a1bd2281f5d4e478840b9/src/ImageSharp/Common/Helpers/SimdUtils.Shuffle.cs#L236-L238 won't emit platform intrinsics but rather fall back to a method call, as the value isn't a constant.
E.g. `Vp8Encoding.FTransformPass1SSE2` looks after inlining the vector constants like
```asm
push rdi
push rsi
push rbx
sub rsp,60
vzeroupper
xor eax,eax
mov [rsp+50],rax
mov [rsp+58],rax
mov rsi,rdx
mov rdi,r8
mov rbx,r9
vmovupd xmm0,[rcx]
lea rcx,[rsp+40]
vmovapd [rsp+30],xmm0
lea rdx,[rsp+30]
mov r8d,0B1
call System.Runtime.Intrinsics.X86.Sse2.ShuffleHigh(System.Runtime.Intrinsics.Vector128`1<Int16>, Byte)
vmovupd xmm0,[rsi]
lea rcx,[rsp+50]
vmovapd [rsp+30],xmm0
lea rdx,[rsp+30]
mov r8d,0B1
call System.Runtime.Intrinsics.X86.Sse2.ShuffleHigh(System.Runtime.Intrinsics.Vector128`1<Int16>, Byte)
vmovapd xmm0,[rsp+40]
vpunpcklqdq xmm0,xmm0,[rsp+50]
vmovapd xmm1,[rsp+40]
vpunpckhqdq xmm1,xmm1,[rsp+50]
vpaddw xmm2,xmm0,xmm1
vpmaddwd xmm3,xmm2,[7FF7D3640A20]
vpmaddwd xmm2,xmm2,[7FF7D3640A30]
vpsubw xmm0,xmm0,xmm1
vpmaddwd xmm1,xmm0,[7FF7D3640A40]
vpaddd xmm1,xmm1,[7FF7D3640A50]
vpsrad xmm1,xmm1,9
vpmaddwd xmm0,xmm0,[7FF7D3640A60]
vpaddd xmm0,xmm0,[7FF7D3640A70]
vpsrad xmm0,xmm0,9
vpackssdw xmm0,xmm1,xmm0
vpackssdw xmm1,xmm3,xmm2
vpunpcklwd xmm2,xmm1,xmm0
vpunpckhwd xmm0,xmm1,xmm0
vpunpckhdq xmm1,xmm2,xmm0
vpunpckldq xmm0,xmm2,xmm0
vmovupd [rdi],xmm0
mov rcx,rbx
vmovapd [rsp+20],xmm1
lea rdx,[rsp+20]
mov r8d,4E
call System.Runtime.Intrinsics.X86.Sse2.Shuffle(System.Runtime.Intrinsics.Vector128`1<Int32>, Byte)
nop
add rsp,60
pop rbx
pop rsi
pop rdi
ret
; Total bytes of code 245
```
If the constant is given as a literal instead of via `SimdUtils.Shuffle.MmShuffle`, the code boils down to:
```asm
vzeroupper
vpshufhw xmm0,[rdx],0B1
vpshufhw xmm1,[rcx],0B1
vpunpcklqdq xmm2,xmm1,xmm0
vpunpckhqdq xmm0,xmm1,xmm0
vpaddw xmm1,xmm2,xmm0
vpmaddwd xmm3,xmm1,[7FF7D365DAC0]
vpmaddwd xmm1,xmm1,[7FF7D365DAD0]
vpsubw xmm0,xmm2,xmm0
vpmaddwd xmm2,xmm0,[7FF7D365DAE0]
vpaddd xmm2,xmm2,[7FF7D365DAF0]
vpsrad xmm2,xmm2,9
vpmaddwd xmm0,xmm0,[7FF7D365DB00]
vpaddd xmm0,xmm0,[7FF7D365DB10]
vpsrad xmm0,xmm0,9
vpackssdw xmm0,xmm2,xmm0
vpackssdw xmm1,xmm3,xmm1
vpunpcklwd xmm2,xmm1,xmm0
vpunpckhwd xmm0,xmm1,xmm0
vpunpckhdq xmm1,xmm2,xmm0
vpunpckldq xmm0,xmm2,xmm0
vmovupd [r8],xmm0
vpshufd xmm0,xmm1,4E
vmovupd [r9],xmm0
ret
; Total bytes of code 127
```
This is de facto a JIT limitation, recorded in https://github.com/dotnet/runtime/issues/9989 and https://github.com/dotnet/runtime/issues/38003, and also noticed in https://github.com/SixLabors/ImageSharp/pull/1517#discussion_r562059328
As this is quite a difference in code-gen, and hence should show up in perf too, I propose typing the shuffle literals explicitly. E.g.
```diff
-Vector128<short> shuf01_p = Sse2.ShuffleHigh(row01, SimdUtils.Shuffle.MmShuffle(2, 3, 0, 1)); // or any similar pattern
+Vector128<short> shuf01_p = Sse2.ShuffleHigh(row01, 0xB1); // MmShuffle(2, 3, 0, 1)
```
---
If OK I'd like to tackle this in one shot with https://github.com/SixLabors/ImageSharp/issues/1762 (as I'm touching these pieces anyway).
Edit: I was too eager, the PR is out...
### Steps to Reproduce
Look at the disassembly of any method that uses `SimdUtils.Shuffle.MmShuffle`.
### Images
_No response_
|
True
|
Vector{128,256} operations that use MmShuffle fall back to method call - ### Prerequisites
- [X] I have written a descriptive issue title
- [X] I have verified that I am running the latest version of ImageSharp
- [X] I have verified if the problem exist in both `DEBUG` and `RELEASE` mode
- [X] I have searched [open](https://github.com/SixLabors/ImageSharp/issues) and [closed](https://github.com/SixLabors/ImageSharp/issues?q=is%3Aissue+is%3Aclosed) issues to ensure it has not already been reported
### ImageSharp version
Current main branch
### Other ImageSharp packages and versions
none
### Environment (Operating system, version and so on)
all .NET supported
### .NET Framework version
all
### Description
While working on https://github.com/SixLabors/ImageSharp/issues/1762 I recognized that methods that use https://github.com/SixLabors/ImageSharp/blob/c661ab1478419004271a1bd2281f5d4e478840b9/src/ImageSharp/Common/Helpers/SimdUtils.Shuffle.cs#L236-L238 won't emit platform intrinsics but rather fall back to a method call, as the value isn't a constant.
E.g. `Vp8Encoding.FTransformPass1SSE2` looks after inlining the vector constants like
```asm
push rdi
push rsi
push rbx
sub rsp,60
vzeroupper
xor eax,eax
mov [rsp+50],rax
mov [rsp+58],rax
mov rsi,rdx
mov rdi,r8
mov rbx,r9
vmovupd xmm0,[rcx]
lea rcx,[rsp+40]
vmovapd [rsp+30],xmm0
lea rdx,[rsp+30]
mov r8d,0B1
call System.Runtime.Intrinsics.X86.Sse2.ShuffleHigh(System.Runtime.Intrinsics.Vector128`1<Int16>, Byte)
vmovupd xmm0,[rsi]
lea rcx,[rsp+50]
vmovapd [rsp+30],xmm0
lea rdx,[rsp+30]
mov r8d,0B1
call System.Runtime.Intrinsics.X86.Sse2.ShuffleHigh(System.Runtime.Intrinsics.Vector128`1<Int16>, Byte)
vmovapd xmm0,[rsp+40]
vpunpcklqdq xmm0,xmm0,[rsp+50]
vmovapd xmm1,[rsp+40]
vpunpckhqdq xmm1,xmm1,[rsp+50]
vpaddw xmm2,xmm0,xmm1
vpmaddwd xmm3,xmm2,[7FF7D3640A20]
vpmaddwd xmm2,xmm2,[7FF7D3640A30]
vpsubw xmm0,xmm0,xmm1
vpmaddwd xmm1,xmm0,[7FF7D3640A40]
vpaddd xmm1,xmm1,[7FF7D3640A50]
vpsrad xmm1,xmm1,9
vpmaddwd xmm0,xmm0,[7FF7D3640A60]
vpaddd xmm0,xmm0,[7FF7D3640A70]
vpsrad xmm0,xmm0,9
vpackssdw xmm0,xmm1,xmm0
vpackssdw xmm1,xmm3,xmm2
vpunpcklwd xmm2,xmm1,xmm0
vpunpckhwd xmm0,xmm1,xmm0
vpunpckhdq xmm1,xmm2,xmm0
vpunpckldq xmm0,xmm2,xmm0
vmovupd [rdi],xmm0
mov rcx,rbx
vmovapd [rsp+20],xmm1
lea rdx,[rsp+20]
mov r8d,4E
call System.Runtime.Intrinsics.X86.Sse2.Shuffle(System.Runtime.Intrinsics.Vector128`1<Int32>, Byte)
nop
add rsp,60
pop rbx
pop rsi
pop rdi
ret
; Total bytes of code 245
```
If the constant is given as a literal instead of via `SimdUtils.Shuffle.MmShuffle`, the code boils down to:
```asm
vzeroupper
vpshufhw xmm0,[rdx],0B1
vpshufhw xmm1,[rcx],0B1
vpunpcklqdq xmm2,xmm1,xmm0
vpunpckhqdq xmm0,xmm1,xmm0
vpaddw xmm1,xmm2,xmm0
vpmaddwd xmm3,xmm1,[7FF7D365DAC0]
vpmaddwd xmm1,xmm1,[7FF7D365DAD0]
vpsubw xmm0,xmm2,xmm0
vpmaddwd xmm2,xmm0,[7FF7D365DAE0]
vpaddd xmm2,xmm2,[7FF7D365DAF0]
vpsrad xmm2,xmm2,9
vpmaddwd xmm0,xmm0,[7FF7D365DB00]
vpaddd xmm0,xmm0,[7FF7D365DB10]
vpsrad xmm0,xmm0,9
vpackssdw xmm0,xmm2,xmm0
vpackssdw xmm1,xmm3,xmm1
vpunpcklwd xmm2,xmm1,xmm0
vpunpckhwd xmm0,xmm1,xmm0
vpunpckhdq xmm1,xmm2,xmm0
vpunpckldq xmm0,xmm2,xmm0
vmovupd [r8],xmm0
vpshufd xmm0,xmm1,4E
vmovupd [r9],xmm0
ret
; Total bytes of code 127
```
This is de facto a JIT limitation, recorded in https://github.com/dotnet/runtime/issues/9989 and https://github.com/dotnet/runtime/issues/38003, and also noticed in https://github.com/SixLabors/ImageSharp/pull/1517#discussion_r562059328
As this is quite a difference in code-gen, and hence should show up in perf too, I propose typing the shuffle literals explicitly. E.g.
```diff
-Vector128<short> shuf01_p = Sse2.ShuffleHigh(row01, SimdUtils.Shuffle.MmShuffle(2, 3, 0, 1)); // or any similar pattern
+Vector128<short> shuf01_p = Sse2.ShuffleHigh(row01, 0xB1); // MmShuffle(2, 3, 0, 1)
```
---
If OK I'd like to tackle this in one shot with https://github.com/SixLabors/ImageSharp/issues/1762 (as I'm touching these pieces anyway).
Edit: I was too eager, the PR is out...
### Steps to Reproduce
Look at the disassembly of any method that uses `SimdUtils.Shuffle.MmShuffle`.
### Images
_No response_
|
non_process
|
vector operations that use mmshuffle fall back to method call prerequisites i have written a descriptive issue title i have verified that i am running the latest version of imagesharp i have verified if the problem exist in both debug and release mode i have searched and issues to ensure it has not already been reported imagesharp version current main branch other imagesharp packages and versions none environment operating system version and so on all net supported net framework version all description while working on i recognized that methods that use won t emit platform intrinsics rather fallback to a method call as the value isn t a constant e g looks after inlining the vector constants like asm push rdi push rsi push rbx sub rsp vzeroupper xor eax eax mov rax mov rax mov rsi rdx mov rdi mov rbx vmovupd lea rcx vmovapd lea rdx mov call system runtime intrinsics shufflehigh system runtime intrinsics byte vmovupd lea rcx vmovapd lea rdx mov call system runtime intrinsics shufflehigh system runtime intrinsics byte vmovapd vpunpcklqdq vmovapd vpunpckhqdq vpaddw vpmaddwd vpmaddwd vpsubw vpmaddwd vpaddd vpsrad vpmaddwd vpaddd vpsrad vpackssdw vpackssdw vpunpcklwd vpunpckhwd vpunpckhdq vpunpckldq vmovupd mov rcx rbx vmovapd lea rdx mov call system runtime intrinsics shuffle system runtime intrinsics byte nop add rsp pop rbx pop rsi pop rdi ret total bytes of code if instead simdutils shuffle mmshuffle the constant is given as literal then the code boils down to asm vzeroupper vpshufhw vpshufhw vpunpcklqdq vpunpckhqdq vpaddw vpmaddwd vpmaddwd vpsubw vpmaddwd vpaddd vpsrad vpmaddwd vpaddd vpsrad vpackssdw vpackssdw vpunpcklwd vpunpckhwd vpunpckhdq vpunpckldq vmovupd vpshufd vmovupd ret total bytes of code this is a de facto a jit limitation recorded in and also noticed in as this is quite a difference in code gen hence in perf it should show up too i propose to change to typing the shuffle literals explicetely e g diff p shufflehigh simdutils shuffle mmshuffle or any 
similar pattern p shufflehigh mmshuffle if ok i d like to tackle this in one shot with as i m touching these pieces anyway edit i was too eager the pr is out steps to reproduce look at dissassembly of any method that uses simdutils shuffle mmshuffle images no response
| 0
|
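The fix in the record above replaces `SimdUtils.Shuffle.MmShuffle(2, 3, 0, 1)` with the literal `0xB1`. The packing rule behind that equivalence is the `_MM_SHUFFLE` convention: four 2-bit lane selectors packed high-to-low into one byte. A quick Python check (the function name is ours, not ImageSharp's):

```python
def mm_shuffle(z, y, x, w):
    """Pack four 2-bit lane selectors into one control byte, _MM_SHUFFLE-style."""
    assert all(0 <= v <= 3 for v in (z, y, x, w)), "each selector is 2 bits"
    return (z << 6) | (y << 4) | (x << 2) | w
```

`mm_shuffle(1, 0, 3, 2)` gives `0x4E`, the other immediate visible in the disassembly (`vpshufd xmm0,xmm1,4E`), so the literals in the proposed diff match the original helper calls.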
2,922
| 5,914,745,296
|
IssuesEvent
|
2017-05-22 04:45:19
|
uccser/verto
|
https://api.github.com/repos/uccser/verto
|
closed
|
Add ability to create inline Scratch images
|
feature processor implementation update
|
An author should be able to add an inline Scratch image, using the following (or similar syntax):
```
- Click inside the `scratch:say [Hello] for (2) secs` block and type to change
“Hello” to what you want to display on the screen.
```
The contents should be written to a file in same way as the block Scratch images.
The `split` and `random` parameters are not allowed in this tag.
|
1.0
|
Add ability to create inline Scratch images - An author should be able to add an inline Scratch image, using the following (or similar syntax):
```
- Click inside the `scratch:say [Hello] for (2) secs` block and type to change
“Hello” to what you want to display on the screen.
```
The contents should be written to a file in same way as the block Scratch images.
The `split` and `random` parameters are not allowed in this tag.
|
process
|
add ability to create inline scratch images an author should be able to add an inline scratch image using the following or similar syntax click inside the scratch say for secs block and type to change “hello” to what you want to display on the screen the contents should be written to a file in same way as the block scratch images the split and random parameters are not allowed in this tag
| 1
|
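The feature request above asks for inline `scratch:` tags inside backtick spans. One hedged sketch of extracting those spans from a line of markdown (the tag grammar is assumed from the example; the real Verto processor's rules may differ):

```python
import re

# Assumed grammar: a backtick-delimited span whose content starts with
# "scratch:"; the capture group is the Scratch block source to render.
INLINE_SCRATCH = re.compile(r"`scratch:([^`]+)`")

def extract_inline_scratch(text):
    """Return the Scratch source of every inline scratch tag in the text."""
    return INLINE_SCRATCH.findall(text)
```

Each extracted source would then be written to a file the same way the block-level Scratch images are, per the record above.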
8,118
| 11,303,161,623
|
IssuesEvent
|
2020-01-17 19:25:28
|
bcgov/entity
|
https://api.github.com/repos/bcgov/entity
|
closed
|
Set up ZenHub
|
OCM_Int Processes
|
Set up ZenHub
- [x] get site set up
- [x] advise KM/Katherine; provide overview; agree on usage norms
- [x] add labels
- [x] pilot with a couple of scrums
- [x] adapt usage norms
|
1.0
|
Set up ZenHub - Set up ZenHub
- [x] get site set up
- [x] advise KM/Katherine; provide overview; agree on usage norms
- [x] add labels
- [x] pilot with a couple of scrums
- [x] adapt usage norms
|
process
|
set up zenhub set up zenhub get site set up advise km katherine provide overview agree on usage norms add labels pilot with a couple of scrums adapt usage norms
| 1
|
19,018
| 25,020,486,044
|
IssuesEvent
|
2022-11-03 23:44:54
|
Almamu/linux-wallpaperengine
|
https://api.github.com/repos/Almamu/linux-wallpaperengine
|
closed
|
[BGFIX] 2270452711
|
bug audio processing
|
**Wallpaper Engine Background(s)**
[2270452711](https://steamcommunity.com/sharedfiles/filedetails/?id=2270452711)
**Console output**
```
z% wallengine 2270452711
No scene.pkg file found at /home/tarulia/.local/share/Steam/steamapps/workshop/content/431960/2270452711/. Defaulting to normal folder storage
Found wallpaper engine's assets at /home/tarulia/.local/share/Steam/steamapps/common/wallpaper_engine/assets
terminate called after throwing an instance of 'WallpaperEngine::Assets::CAssetLoadException'
what(): Cannot find file gifscene.json: Cannot find file in any of the containers
zsh: IOT instruction (core dumped) wallengine 2270452711
```
```
z% coredumpctl debug
PID: 970418 (wallengine)
UID: 1000 (tarulia)
GID: 1000 (tarulia)
Signal: 6 (ABRT)
Timestamp: Thu 2022-07-14 15:00:50 CEST (5s ago)
Command Line: wallengine 2270452711
Executable: /usr/bin/wallengine
Control Group: /user.slice/user-1000.slice/user@1000.service/app.slice/app-konsole-033ed2b22d90494298c8c1351f4e45e3.scope
Unit: user@1000.service
User Unit: app-konsole-033ed2b22d90494298c8c1351f4e45e3.scope
Slice: user-1000.slice
Owner UID: 1000 (tarulia)
Boot ID: 25a61743acf84a71ac5b2fd4a85cd29d
Machine ID: 514d37dbc9b846beb68f6931e905b78b
Hostname: fedora
Storage: /var/lib/systemd/coredump/core.wallengine.1000.25a61743acf84a71ac5b2fd4a85cd29d.970418.1657803650000000.zst (present)
Disk Size: 3.9M
Message: Process 970418 (wallengine) of user 1000 dumped core.
Module linux-vdso.so.1 with build-id 5785fbe1cc105912f683ad7629d33f7d6e3692ce
Module libFLAC.so.8 with build-id b07e698f25b5c96b858a4224a09db7009edfd296
Metadata for module libFLAC.so.8 owned by FDO found: {
"type" : "rpm",
"name" : "flac",
"version" : "1.3.4-1.fc36",
"architecture" : "x86_64",
"osCpe" : "cpe:/o:fedoraproject:fedora:36"
}
Module libasyncns.so.0 with build-id 4e12c4bfd3608b4e4b69679dbd61bed6e7fd1ab4
Metadata for module libasyncns.so.0 owned by FDO found: {
"type" : "rpm",
"name" : "libasyncns",
"version" : "0.8-22.fc36",
"architecture" : "x86_64",
"osCpe" : "cpe:/o:fedoraproject:fedora:36"
}
Module libsndfile.so.1 with build-id a3ae47d4cba9a2f0fc45aa01fac8a116fc841e16
Metadata for module libsndfile.so.1 owned by FDO found: {
"type" : "rpm",
"name" : "libsndfile",
"version" : "1.0.31-7.fc36",
"architecture" : "x86_64",
"osCpe" : "cpe:/o:fedoraproject:fedora:36"
}
Module libpulsecommon-15.0.so with build-id 1b44db205944eb0f35256db3a2affc0c08a6ddf7
Metadata for module libpulsecommon-15.0.so owned by FDO found: {
"type" : "rpm",
"name" : "pulseaudio",
"version" : "15.0-5.fc36",
"architecture" : "x86_64",
"osCpe" : "cpe:/o:fedoraproject:fedora:36"
}
Module libpulse.so.0 with build-id 47a56fe79c3265aad602438e6ea27960ae796f69
Metadata for module libpulse.so.0 owned by FDO found: {
"type" : "rpm",
"name" : "pulseaudio",
"version" : "15.0-5.fc36",
"architecture" : "x86_64",
"osCpe" : "cpe:/o:fedoraproject:fedora:36"
}
Module libpulse-simple.so.0 with build-id 3c01fe813c1aa996ec2757e437f0cb7a73d7d1ae
Metadata for module libpulse-simple.so.0 owned by FDO found: {
"type" : "rpm",
"name" : "pulseaudio",
"version" : "15.0-5.fc36",
"architecture" : "x86_64",
"osCpe" : "cpe:/o:fedoraproject:fedora:36"
}
Module libdbus-1.so.3 with build-id 180fe9c567fe8c90bdbe2a2c9316ef745621e4b2
Metadata for module libdbus-1.so.3 owned by FDO found: {
"type" : "rpm",
"name" : "dbus",
"version" : "1.14.0-1.fc36",
"architecture" : "x86_64",
"osCpe" : "cpe:/o:fedoraproject:fedora:36"
}
Module libtinfo.so.6 with build-id 954d12f7d8216fde821db122a4768ee255382a63
Metadata for module libtinfo.so.6 owned by FDO found: {
"type" : "rpm",
"name" : "ncurses",
"version" : "6.2-9.20210508.fc36",
"architecture" : "x86_64",
"osCpe" : "cpe:/o:fedoraproject:fedora:36"
}
Module libedit.so.0 with build-id 786ebbe150c63e27beb2957d717bece33431af6f
Stack trace of thread 970418:
#0 0x00007f4fc5a8ec4c __pthread_kill_implementation (libc.so.6 + 0x8ec4c)
#1 0x00007f4fc5a3e9c6 raise (libc.so.6 + 0x3e9c6)
#2 0x00007f4fc5a287f4 abort (libc.so.6 + 0x287f4)
#3 0x00007f4fc5ea2b57 _ZN9__gnu_cxx27__verbose_terminate_handlerEv.cold (libstdc++.so.6 + 0xa2b57)
#4 0x00007f4fc5eae43c _ZN10__cxxabiv111__terminateEPFvvE (libstdc++.so.6 + 0xae43c)
#5 0x00007f4fc5eae4a7 _ZSt9terminatev (libstdc++.so.6 + 0xae4a7)
#6 0x00007f4fc5eae708 __cxa_throw (libstdc++.so.6 + 0xae708)
#7 0x000056031e19de32 _ZN15WallpaperEngine6Assets18CCombinedContainer8readFileENSt7__cxx1112basic_stringIcSt11char_traitsIcESaIcEEEPj.cold (wallengine + 0x1ce32)
#8 0x000056031e1c5873 _ZN15WallpaperEngine10FileSystem12loadFullFileERKNSt7__cxx1112basic_stringIcSt11char_traitsIcESaIcEEEPNS_6Assets10CContainerE (wallengine + 0x44873)
#9 0x000056031e1cb49c _ZN15WallpaperEngine4Core6CScene8fromFileERKNSt7__cxx1112basic_stringIcSt11char_traitsIcESaIcEEEPNS_6Assets10CContainerE (wallengine + 0x4a49c)
#10 0x000056031e1cc3d7 _ZN15WallpaperEngine4Core8CProject8fromFileERKNSt7__cxx1112basic_stringIcSt11char_traitsIcESaIcEEEPNS_6Assets10CContainerE (wallengine + 0x4b3d7)
#11 0x000056031e1a3f59 main (wallengine + 0x22f59)
#12 0x00007f4fc5a29550 __libc_start_call_main (libc.so.6 + 0x29550)
#13 0x00007f4fc5a29609 __libc_start_main@@GLIBC_2.34 (libc.so.6 + 0x29609)
#14 0x000056031e1a56c5 _start (wallengine + 0x246c5)
Stack trace of thread 970420:
#0 0x00007f4fc5ad94f7 getdents64 (libc.so.6 + 0xd94f7)
#1 0x00007f4fc5ad95a5 readdir64 (libc.so.6 + 0xd95a5)
#2 0x00007f4fad8ba50d is_two_character_sub_directory (radeonsi_dri.so + 0xba50d)
#3 0x00007f4fad8ba668 choose_lru_file_matching (radeonsi_dri.so + 0xba668)
#4 0x00007f4fad8bae33 disk_cache_evict_lru_item (radeonsi_dri.so + 0xbae33)
#5 0x00007f4fad8b9cbd cache_put (radeonsi_dri.so + 0xb9cbd)
#6 0x00007f4fad8c19c4 util_queue_thread_func (radeonsi_dri.so + 0xc19c4)
#7 0x00007f4fad8c154b impl_thrd_routine (radeonsi_dri.so + 0xc154b)
#8 0x00007f4fc5a8ce2d start_thread (libc.so.6 + 0x8ce2d)
#9 0x00007f4fc5b12620 __clone3 (libc.so.6 + 0x112620)
Stack trace of thread 970419:
#0 0x00007f4fc5a89a19 __futex_abstimed_wait_common (libc.so.6 + 0x89a19)
#1 0x00007f4fc5a8c210 pthread_cond_wait@@GLIBC_2.3.2 (libc.so.6 + 0x8c210)
#2 0x00007f4fad8c18fb util_queue_thread_func (radeonsi_dri.so + 0xc18fb)
#3 0x00007f4fad8c154b impl_thrd_routine (radeonsi_dri.so + 0xc154b)
#4 0x00007f4fc5a8ce2d start_thread (libc.so.6 + 0x8ce2d)
#5 0x00007f4fc5b12620 __clone3 (libc.so.6 + 0x112620)
Stack trace of thread 970421:
#0 0x00007f4fc5a89a19 __futex_abstimed_wait_common (libc.so.6 + 0x89a19)
#1 0x00007f4fc5a8c210 pthread_cond_wait@@GLIBC_2.3.2 (libc.so.6 + 0x8c210)
#2 0x00007f4fad8c18fb util_queue_thread_func (radeonsi_dri.so + 0xc18fb)
#3 0x00007f4fad8c154b impl_thrd_routine (radeonsi_dri.so + 0xc154b)
#4 0x00007f4fc5a8ce2d start_thread (libc.so.6 + 0x8ce2d)
#5 0x00007f4fc5b12620 __clone3 (libc.so.6 + 0x112620)
Stack trace of thread 970425:
#0 0x00007f4fc5a89a19 __futex_abstimed_wait_common (libc.so.6 + 0x89a19)
#1 0x00007f4fc5a8c210 pthread_cond_wait@@GLIBC_2.3.2 (libc.so.6 + 0x8c210)
#2 0x00007f4fad8c18fb util_queue_thread_func (radeonsi_dri.so + 0xc18fb)
#3 0x00007f4fad8c154b impl_thrd_routine (radeonsi_dri.so + 0xc154b)
#4 0x00007f4fc5a8ce2d start_thread (libc.so.6 + 0x8ce2d)
#5 0x00007f4fc5b12620 __clone3 (libc.so.6 + 0x112620)
Stack trace of thread 970426:
#0 0x00007f4fc5a89a19 __futex_abstimed_wait_common (libc.so.6 + 0x89a19)
#1 0x00007f4fc5a8c210 pthread_cond_wait@@GLIBC_2.3.2 (libc.so.6 + 0x8c210)
#2 0x00007f4fad8c18fb util_queue_thread_func (radeonsi_dri.so + 0xc18fb)
#3 0x00007f4fad8c154b impl_thrd_routine (radeonsi_dri.so + 0xc154b)
#4 0x00007f4fc5a8ce2d start_thread (libc.so.6 + 0x8ce2d)
#5 0x00007f4fc5b12620 __clone3 (libc.so.6 + 0x112620)
Stack trace of thread 970428:
#0 0x00007f4fc5a89a19 __futex_abstimed_wait_common (libc.so.6 + 0x89a19)
#1 0x00007f4fc5a8c210 pthread_cond_wait@@GLIBC_2.3.2 (libc.so.6 + 0x8c210)
#2 0x00007f4fad8c18fb util_queue_thread_func (radeonsi_dri.so + 0xc18fb)
#3 0x00007f4fad8c154b impl_thrd_routine (radeonsi_dri.so + 0xc154b)
#4 0x00007f4fc5a8ce2d start_thread (libc.so.6 + 0x8ce2d)
#5 0x00007f4fc5b12620 __clone3 (libc.so.6 + 0x112620)
Stack trace of thread 970427:
#0 0x00007f4fc5a89a19 __futex_abstimed_wait_common (libc.so.6 + 0x89a19)
#1 0x00007f4fc5a8c210 pthread_cond_wait@@GLIBC_2.3.2 (libc.so.6 + 0x8c210)
#2 0x00007f4fad8c18fb util_queue_thread_func (radeonsi_dri.so + 0xc18fb)
#3 0x00007f4fad8c154b impl_thrd_routine (radeonsi_dri.so + 0xc154b)
#4 0x00007f4fc5a8ce2d start_thread (libc.so.6 + 0x8ce2d)
#5 0x00007f4fc5b12620 __clone3 (libc.so.6 + 0x112620)
Stack trace of thread 970433:
#0 0x00007f4fc5a89a19 __futex_abstimed_wait_common (libc.so.6 + 0x89a19)
#1 0x00007f4fc5a8c210 pthread_cond_wait@@GLIBC_2.3.2 (libc.so.6 + 0x8c210)
#2 0x00007f4fad8c18fb util_queue_thread_func (radeonsi_dri.so + 0xc18fb)
#3 0x00007f4fad8c154b impl_thrd_routine (radeonsi_dri.so + 0xc154b)
#4 0x00007f4fc5a8ce2d start_thread (libc.so.6 + 0x8ce2d)
#5 0x00007f4fc5b12620 __clone3 (libc.so.6 + 0x112620)
Stack trace of thread 970424:
#0 0x00007f4fc5a89a19 __futex_abstimed_wait_common (libc.so.6 + 0x89a19)
#1 0x00007f4fc5a8c210 pthread_cond_wait@@GLIBC_2.3.2 (libc.so.6 + 0x8c210)
#2 0x00007f4fad8c18fb util_queue_thread_func (radeonsi_dri.so + 0xc18fb)
#3 0x00007f4fad8c154b impl_thrd_routine (radeonsi_dri.so + 0xc154b)
#4 0x00007f4fc5a8ce2d start_thread (libc.so.6 + 0x8ce2d)
#5 0x00007f4fc5b12620 __clone3 (libc.so.6 + 0x112620)
Stack trace of thread 970439:
#0 0x00007f4fc5b05cee ppoll (libc.so.6 + 0x105cee)
#1 0x00007f4fa402d341 pa_mainloop_poll (libpulse.so.0 + 0x1e341)
#2 0x00007f4fa4037a5a pa_mainloop_iterate (libpulse.so.0 + 0x28a5a)
#3 0x00007f4fa4037b00 pa_mainloop_run (libpulse.so.0 + 0x28b00)
#4 0x00007f4fbc9380f3 HotplugThread (libSDL2-2.0.so.0 + 0xec0f3)
#5 0x00007f4fbc9799ef RunThread.lto_priv.0 (libSDL2-2.0.so.0 + 0x12d9ef)
#6 0x00007f4fc5a8ce2d start_thread (libc.so.6 + 0x8ce2d)
#7 0x00007f4fc5b12620 __clone3 (libc.so.6 + 0x112620)
Stack trace of thread 970431:
#0 0x00007f4fc5a89a19 __futex_abstimed_wait_common (libc.so.6 + 0x89a19)
#1 0x00007f4fc5a8c210 pthread_cond_wait@@GLIBC_2.3.2 (libc.so.6 + 0x8c210)
#2 0x00007f4fad8c18fb util_queue_thread_func (radeonsi_dri.so + 0xc18fb)
#3 0x00007f4fad8c154b impl_thrd_routine (radeonsi_dri.so + 0xc154b)
#4 0x00007f4fc5a8ce2d start_thread (libc.so.6 + 0x8ce2d)
#5 0x00007f4fc5b12620 __clone3 (libc.so.6 + 0x112620)
Stack trace of thread 970422:
#0 0x00007f4fc5a89a19 __futex_abstimed_wait_common (libc.so.6 + 0x89a19)
#1 0x00007f4fc5a8c210 pthread_cond_wait@@GLIBC_2.3.2 (libc.so.6 + 0x8c210)
#2 0x00007f4fad8c18fb util_queue_thread_func (radeonsi_dri.so + 0xc18fb)
#3 0x00007f4fad8c154b impl_thrd_routine (radeonsi_dri.so + 0xc154b)
#4 0x00007f4fc5a8ce2d start_thread (libc.so.6 + 0x8ce2d)
#5 0x00007f4fc5b12620 __clone3 (libc.so.6 + 0x112620)
Stack trace of thread 970437:
#0 0x00007f4fc5a89a19 __futex_abstimed_wait_common (libc.so.6 + 0x89a19)
#1 0x00007f4fc5a8c210 pthread_cond_wait@@GLIBC_2.3.2 (libc.so.6 + 0x8c210)
#2 0x00007f4fad8c18fb util_queue_thread_func (radeonsi_dri.so + 0xc18fb)
#3 0x00007f4fad8c154b impl_thrd_routine (radeonsi_dri.so + 0xc154b)
#4 0x00007f4fc5a8ce2d start_thread (libc.so.6 + 0x8ce2d)
#5 0x00007f4fc5b12620 __clone3 (libc.so.6 + 0x112620)
Stack trace of thread 970423:
#0 0x00007f4fc5a89a19 __futex_abstimed_wait_common (libc.so.6 + 0x89a19)
#1 0x00007f4fc5a8c210 pthread_cond_wait@@GLIBC_2.3.2 (libc.so.6 + 0x8c210)
#2 0x00007f4fad8c18fb util_queue_thread_func (radeonsi_dri.so + 0xc18fb)
#3 0x00007f4fad8c154b impl_thrd_routine (radeonsi_dri.so + 0xc154b)
#4 0x00007f4fc5a8ce2d start_thread (libc.so.6 + 0x8ce2d)
#5 0x00007f4fc5b12620 __clone3 (libc.so.6 + 0x112620)
Stack trace of thread 970435:
#0 0x00007f4fc5a89a19 __futex_abstimed_wait_common (libc.so.6 + 0x89a19)
#1 0x00007f4fc5a8c210 pthread_cond_wait@@GLIBC_2.3.2 (libc.so.6 + 0x8c210)
#2 0x00007f4fad8c18fb util_queue_thread_func (radeonsi_dri.so + 0xc18fb)
#3 0x00007f4fad8c154b impl_thrd_routine (radeonsi_dri.so + 0xc154b)
#4 0x00007f4fc5a8ce2d start_thread (libc.so.6 + 0x8ce2d)
#5 0x00007f4fc5b12620 __clone3 (libc.so.6 + 0x112620)
Stack trace of thread 970429:
#0 0x00007f4fc5a89a19 __futex_abstimed_wait_common (libc.so.6 + 0x89a19)
#1 0x00007f4fc5a8c210 pthread_cond_wait@@GLIBC_2.3.2 (libc.so.6 + 0x8c210)
#2 0x00007f4fad8c18fb util_queue_thread_func (radeonsi_dri.so + 0xc18fb)
#3 0x00007f4fad8c154b impl_thrd_routine (radeonsi_dri.so + 0xc154b)
#4 0x00007f4fc5a8ce2d start_thread (libc.so.6 + 0x8ce2d)
#5 0x00007f4fc5b12620 __clone3 (libc.so.6 + 0x112620)
Stack trace of thread 970434:
#0 0x00007f4fc5a89a19 __futex_abstimed_wait_common (libc.so.6 + 0x89a19)
#1 0x00007f4fc5a8c210 pthread_cond_wait@@GLIBC_2.3.2 (libc.so.6 + 0x8c210)
#2 0x00007f4fad8c18fb util_queue_thread_func (radeonsi_dri.so + 0xc18fb)
#3 0x00007f4fad8c154b impl_thrd_routine (radeonsi_dri.so + 0xc154b)
#4 0x00007f4fc5a8ce2d start_thread (libc.so.6 + 0x8ce2d)
#5 0x00007f4fc5b12620 __clone3 (libc.so.6 + 0x112620)
Stack trace of thread 970438:
#0 0x00007f4fc5a89a19 __futex_abstimed_wait_common (libc.so.6 + 0x89a19)
#1 0x00007f4fc5a8c210 pthread_cond_wait@@GLIBC_2.3.2 (libc.so.6 + 0x8c210)
#2 0x00007f4fad8c18fb util_queue_thread_func (radeonsi_dri.so + 0xc18fb)
#3 0x00007f4fad8c154b impl_thrd_routine (radeonsi_dri.so + 0xc154b)
#4 0x00007f4fc5a8ce2d start_thread (libc.so.6 + 0x8ce2d)
#5 0x00007f4fc5b12620 __clone3 (libc.so.6 + 0x112620)
Stack trace of thread 970436:
#0 0x00007f4fc5a89a19 __futex_abstimed_wait_common (libc.so.6 + 0x89a19)
#1 0x00007f4fc5a8c210 pthread_cond_wait@@GLIBC_2.3.2 (libc.so.6 + 0x8c210)
#2 0x00007f4fad8c18fb util_queue_thread_func (radeonsi_dri.so + 0xc18fb)
#3 0x00007f4fad8c154b impl_thrd_routine (radeonsi_dri.so + 0xc154b)
#4 0x00007f4fc5a8ce2d start_thread (libc.so.6 + 0x8ce2d)
#5 0x00007f4fc5b12620 __clone3 (libc.so.6 + 0x112620)
Stack trace of thread 970432:
#0 0x00007f4fc5a89a19 __futex_abstimed_wait_common (libc.so.6 + 0x89a19)
#1 0x00007f4fc5a8c210 pthread_cond_wait@@GLIBC_2.3.2 (libc.so.6 + 0x8c210)
#2 0x00007f4fad8c18fb util_queue_thread_func (radeonsi_dri.so + 0xc18fb)
#3 0x00007f4fad8c154b impl_thrd_routine (radeonsi_dri.so + 0xc154b)
#4 0x00007f4fc5a8ce2d start_thread (libc.so.6 + 0x8ce2d)
#5 0x00007f4fc5b12620 __clone3 (libc.so.6 + 0x112620)
Stack trace of thread 970430:
#0 0x00007f4fc5a89a19 __futex_abstimed_wait_common (libc.so.6 + 0x89a19)
#1 0x00007f4fc5a8c210 pthread_cond_wait@@GLIBC_2.3.2 (libc.so.6 + 0x8c210)
#2 0x00007f4fad8c18fb util_queue_thread_func (radeonsi_dri.so + 0xc18fb)
#3 0x00007f4fad8c154b impl_thrd_routine (radeonsi_dri.so + 0xc154b)
#4 0x00007f4fc5a8ce2d start_thread (libc.so.6 + 0x8ce2d)
#5 0x00007f4fc5b12620 __clone3 (libc.so.6 + 0x112620)
ELF object binary architecture: AMD x86-64
GNU gdb (GDB) Fedora 12.1-1.fc36
Copyright (C) 2022 Free Software Foundation, Inc.
License GPLv3+: GNU GPL version 3 or later <http://gnu.org/licenses/gpl.html>
This is free software: you are free to change and redistribute it.
There is NO WARRANTY, to the extent permitted by law.
Type "show copying" and "show warranty" for details.
This GDB was configured as "x86_64-redhat-linux-gnu".
Type "show configuration" for configuration details.
For bug reporting instructions, please see:
<https://www.gnu.org/software/gdb/bugs/>.
Find the GDB manual and other documentation resources online at:
<http://www.gnu.org/software/gdb/documentation/>.
For help, type "help".
Type "apropos word" to search for commands related to "word"...
Reading symbols from /usr/bin/wallengine...
Reading symbols from /usr/lib/debug/usr/bin/wallengine-0.0.1^20220610.ec6164c-1.fc36.x86_64.debug...
warning: Can't open file /memfd:pulseaudio (deleted) during file-backed mapping note processing
warning: Can't open file /memfd:xshmfence (deleted) during file-backed mapping note processing
[New LWP 970418]
[New LWP 970420]
[New LWP 970419]
[New LWP 970421]
[New LWP 970425]
[New LWP 970426]
[New LWP 970428]
[New LWP 970427]
[New LWP 970433]
[New LWP 970424]
[New LWP 970439]
[New LWP 970431]
[New LWP 970422]
[New LWP 970437]
[New LWP 970423]
[New LWP 970435]
[New LWP 970429]
[New LWP 970434]
[New LWP 970438]
[New LWP 970436]
[New LWP 970432]
[New LWP 970430]
[Thread debugging using libthread_db enabled]
Using host libthread_db library "/lib64/libthread_db.so.1".
--Type <RET> for more, q to quit, c to continue without paging--c
Core was generated by `wallengine 2270452711'.
Program terminated with signal SIGABRT, Aborted.
#0 __pthread_kill_implementation (threadid=<optimized out>, signo=signo@entry=6, no_tid=no_tid@entry=0) at pthread_kill.c:44
44 return INTERNAL_SYSCALL_ERROR_P (ret) ? INTERNAL_SYSCALL_ERRNO (ret) : 0;
[Current thread is 1 (Thread 0x7f4fbe5ef3c0 (LWP 970418))]
(gdb)
```
**Desktop (please complete the following information):**
- OS: Fedora 36
- Desktop Environment: KDE
- Window Manager: KWin
|
1.0
|
[BGFIX] 2270452711 - **Wallpaper Engine Background(s)**
[2270452711](https://steamcommunity.com/sharedfiles/filedetails/?id=2270452711)
**Console output**
```
z% wallengine 2270452711
No scene.pkg file found at /home/tarulia/.local/share/Steam/steamapps/workshop/content/431960/2270452711/. Defaulting to normal folder storage
Found wallpaper engine's assets at /home/tarulia/.local/share/Steam/steamapps/common/wallpaper_engine/assets
terminate called after throwing an instance of 'WallpaperEngine::Assets::CAssetLoadException'
what(): Cannot find file gifscene.json: Cannot find file in any of the containers
zsh: IOT instruction (core dumped) wallengine 2270452711
```
```
z% coredumpctl debug
PID: 970418 (wallengine)
UID: 1000 (tarulia)
GID: 1000 (tarulia)
Signal: 6 (ABRT)
Timestamp: Thu 2022-07-14 15:00:50 CEST (5s ago)
Command Line: wallengine 2270452711
Executable: /usr/bin/wallengine
Control Group: /user.slice/user-1000.slice/user@1000.service/app.slice/app-konsole-033ed2b22d90494298c8c1351f4e45e3.scope
Unit: user@1000.service
User Unit: app-konsole-033ed2b22d90494298c8c1351f4e45e3.scope
Slice: user-1000.slice
Owner UID: 1000 (tarulia)
Boot ID: 25a61743acf84a71ac5b2fd4a85cd29d
Machine ID: 514d37dbc9b846beb68f6931e905b78b
Hostname: fedora
Storage: /var/lib/systemd/coredump/core.wallengine.1000.25a61743acf84a71ac5b2fd4a85cd29d.970418.1657803650000000.zst (present)
Disk Size: 3.9M
Message: Process 970418 (wallengine) of user 1000 dumped core.
Module linux-vdso.so.1 with build-id 5785fbe1cc105912f683ad7629d33f7d6e3692ce
Module libFLAC.so.8 with build-id b07e698f25b5c96b858a4224a09db7009edfd296
Metadata for module libFLAC.so.8 owned by FDO found: {
"type" : "rpm",
"name" : "flac",
"version" : "1.3.4-1.fc36",
"architecture" : "x86_64",
"osCpe" : "cpe:/o:fedoraproject:fedora:36"
}
Module libasyncns.so.0 with build-id 4e12c4bfd3608b4e4b69679dbd61bed6e7fd1ab4
Metadata for module libasyncns.so.0 owned by FDO found: {
"type" : "rpm",
"name" : "libasyncns",
"version" : "0.8-22.fc36",
"architecture" : "x86_64",
"osCpe" : "cpe:/o:fedoraproject:fedora:36"
}
Module libsndfile.so.1 with build-id a3ae47d4cba9a2f0fc45aa01fac8a116fc841e16
Metadata for module libsndfile.so.1 owned by FDO found: {
"type" : "rpm",
"name" : "libsndfile",
"version" : "1.0.31-7.fc36",
"architecture" : "x86_64",
"osCpe" : "cpe:/o:fedoraproject:fedora:36"
}
Module libpulsecommon-15.0.so with build-id 1b44db205944eb0f35256db3a2affc0c08a6ddf7
Metadata for module libpulsecommon-15.0.so owned by FDO found: {
"type" : "rpm",
"name" : "pulseaudio",
"version" : "15.0-5.fc36",
"architecture" : "x86_64",
"osCpe" : "cpe:/o:fedoraproject:fedora:36"
}
Module libpulse.so.0 with build-id 47a56fe79c3265aad602438e6ea27960ae796f69
Metadata for module libpulse.so.0 owned by FDO found: {
"type" : "rpm",
"name" : "pulseaudio",
"version" : "15.0-5.fc36",
"architecture" : "x86_64",
"osCpe" : "cpe:/o:fedoraproject:fedora:36"
}
Module libpulse-simple.so.0 with build-id 3c01fe813c1aa996ec2757e437f0cb7a73d7d1ae
Metadata for module libpulse-simple.so.0 owned by FDO found: {
"type" : "rpm",
"name" : "pulseaudio",
"version" : "15.0-5.fc36",
"architecture" : "x86_64",
"osCpe" : "cpe:/o:fedoraproject:fedora:36"
}
Module libdbus-1.so.3 with build-id 180fe9c567fe8c90bdbe2a2c9316ef745621e4b2
Metadata for module libdbus-1.so.3 owned by FDO found: {
"type" : "rpm",
"name" : "dbus",
"version" : "1.14.0-1.fc36",
"architecture" : "x86_64",
"osCpe" : "cpe:/o:fedoraproject:fedora:36"
}
Module libtinfo.so.6 with build-id 954d12f7d8216fde821db122a4768ee255382a63
Metadata for module libtinfo.so.6 owned by FDO found: {
"type" : "rpm",
"name" : "ncurses",
"version" : "6.2-9.20210508.fc36",
"architecture" : "x86_64",
"osCpe" : "cpe:/o:fedoraproject:fedora:36"
}
Module libedit.so.0 with build-id 786ebbe150c63e27beb2957d717bece33431af6f
Stack trace of thread 970418:
#0 0x00007f4fc5a8ec4c __pthread_kill_implementation (libc.so.6 + 0x8ec4c)
#1 0x00007f4fc5a3e9c6 raise (libc.so.6 + 0x3e9c6)
#2 0x00007f4fc5a287f4 abort (libc.so.6 + 0x287f4)
#3 0x00007f4fc5ea2b57 _ZN9__gnu_cxx27__verbose_terminate_handlerEv.cold (libstdc++.so.6 + 0xa2b57)
#4 0x00007f4fc5eae43c _ZN10__cxxabiv111__terminateEPFvvE (libstdc++.so.6 + 0xae43c)
#5 0x00007f4fc5eae4a7 _ZSt9terminatev (libstdc++.so.6 + 0xae4a7)
#6 0x00007f4fc5eae708 __cxa_throw (libstdc++.so.6 + 0xae708)
#7 0x000056031e19de32 _ZN15WallpaperEngine6Assets18CCombinedContainer8readFileENSt7__cxx1112basic_stringIcSt11char_traitsIcESaIcEEEPj.cold (wallengine + 0x1ce32)
#8 0x000056031e1c5873 _ZN15WallpaperEngine10FileSystem12loadFullFileERKNSt7__cxx1112basic_stringIcSt11char_traitsIcESaIcEEEPNS_6Assets10CContainerE (wallengine + 0x44873)
#9 0x000056031e1cb49c _ZN15WallpaperEngine4Core6CScene8fromFileERKNSt7__cxx1112basic_stringIcSt11char_traitsIcESaIcEEEPNS_6Assets10CContainerE (wallengine + 0x4a49c)
#10 0x000056031e1cc3d7 _ZN15WallpaperEngine4Core8CProject8fromFileERKNSt7__cxx1112basic_stringIcSt11char_traitsIcESaIcEEEPNS_6Assets10CContainerE (wallengine + 0x4b3d7)
#11 0x000056031e1a3f59 main (wallengine + 0x22f59)
#12 0x00007f4fc5a29550 __libc_start_call_main (libc.so.6 + 0x29550)
#13 0x00007f4fc5a29609 __libc_start_main@@GLIBC_2.34 (libc.so.6 + 0x29609)
#14 0x000056031e1a56c5 _start (wallengine + 0x246c5)
Stack trace of thread 970420:
#0 0x00007f4fc5ad94f7 getdents64 (libc.so.6 + 0xd94f7)
#1 0x00007f4fc5ad95a5 readdir64 (libc.so.6 + 0xd95a5)
#2 0x00007f4fad8ba50d is_two_character_sub_directory (radeonsi_dri.so + 0xba50d)
#3 0x00007f4fad8ba668 choose_lru_file_matching (radeonsi_dri.so + 0xba668)
#4 0x00007f4fad8bae33 disk_cache_evict_lru_item (radeonsi_dri.so + 0xbae33)
#5 0x00007f4fad8b9cbd cache_put (radeonsi_dri.so + 0xb9cbd)
#6 0x00007f4fad8c19c4 util_queue_thread_func (radeonsi_dri.so + 0xc19c4)
#7 0x00007f4fad8c154b impl_thrd_routine (radeonsi_dri.so + 0xc154b)
#8 0x00007f4fc5a8ce2d start_thread (libc.so.6 + 0x8ce2d)
#9 0x00007f4fc5b12620 __clone3 (libc.so.6 + 0x112620)
Stack trace of thread 970419:
#0 0x00007f4fc5a89a19 __futex_abstimed_wait_common (libc.so.6 + 0x89a19)
#1 0x00007f4fc5a8c210 pthread_cond_wait@@GLIBC_2.3.2 (libc.so.6 + 0x8c210)
#2 0x00007f4fad8c18fb util_queue_thread_func (radeonsi_dri.so + 0xc18fb)
#3 0x00007f4fad8c154b impl_thrd_routine (radeonsi_dri.so + 0xc154b)
#4 0x00007f4fc5a8ce2d start_thread (libc.so.6 + 0x8ce2d)
#5 0x00007f4fc5b12620 __clone3 (libc.so.6 + 0x112620)
Stack trace of thread 970421:
#0 0x00007f4fc5a89a19 __futex_abstimed_wait_common (libc.so.6 + 0x89a19)
#1 0x00007f4fc5a8c210 pthread_cond_wait@@GLIBC_2.3.2 (libc.so.6 + 0x8c210)
#2 0x00007f4fad8c18fb util_queue_thread_func (radeonsi_dri.so + 0xc18fb)
#3 0x00007f4fad8c154b impl_thrd_routine (radeonsi_dri.so + 0xc154b)
#4 0x00007f4fc5a8ce2d start_thread (libc.so.6 + 0x8ce2d)
#5 0x00007f4fc5b12620 __clone3 (libc.so.6 + 0x112620)
Stack trace of thread 970425:
#0 0x00007f4fc5a89a19 __futex_abstimed_wait_common (libc.so.6 + 0x89a19)
#1 0x00007f4fc5a8c210 pthread_cond_wait@@GLIBC_2.3.2 (libc.so.6 + 0x8c210)
#2 0x00007f4fad8c18fb util_queue_thread_func (radeonsi_dri.so + 0xc18fb)
#3 0x00007f4fad8c154b impl_thrd_routine (radeonsi_dri.so + 0xc154b)
#4 0x00007f4fc5a8ce2d start_thread (libc.so.6 + 0x8ce2d)
#5 0x00007f4fc5b12620 __clone3 (libc.so.6 + 0x112620)
Stack trace of thread 970426:
#0 0x00007f4fc5a89a19 __futex_abstimed_wait_common (libc.so.6 + 0x89a19)
#1 0x00007f4fc5a8c210 pthread_cond_wait@@GLIBC_2.3.2 (libc.so.6 + 0x8c210)
#2 0x00007f4fad8c18fb util_queue_thread_func (radeonsi_dri.so + 0xc18fb)
#3 0x00007f4fad8c154b impl_thrd_routine (radeonsi_dri.so + 0xc154b)
#4 0x00007f4fc5a8ce2d start_thread (libc.so.6 + 0x8ce2d)
#5 0x00007f4fc5b12620 __clone3 (libc.so.6 + 0x112620)
Stack trace of thread 970428:
#0 0x00007f4fc5a89a19 __futex_abstimed_wait_common (libc.so.6 + 0x89a19)
#1 0x00007f4fc5a8c210 pthread_cond_wait@@GLIBC_2.3.2 (libc.so.6 + 0x8c210)
#2 0x00007f4fad8c18fb util_queue_thread_func (radeonsi_dri.so + 0xc18fb)
#3 0x00007f4fad8c154b impl_thrd_routine (radeonsi_dri.so + 0xc154b)
#4 0x00007f4fc5a8ce2d start_thread (libc.so.6 + 0x8ce2d)
#5 0x00007f4fc5b12620 __clone3 (libc.so.6 + 0x112620)
Stack trace of thread 970427:
#0 0x00007f4fc5a89a19 __futex_abstimed_wait_common (libc.so.6 + 0x89a19)
#1 0x00007f4fc5a8c210 pthread_cond_wait@@GLIBC_2.3.2 (libc.so.6 + 0x8c210)
#2 0x00007f4fad8c18fb util_queue_thread_func (radeonsi_dri.so + 0xc18fb)
#3 0x00007f4fad8c154b impl_thrd_routine (radeonsi_dri.so + 0xc154b)
#4 0x00007f4fc5a8ce2d start_thread (libc.so.6 + 0x8ce2d)
#5 0x00007f4fc5b12620 __clone3 (libc.so.6 + 0x112620)
Stack trace of thread 970433:
#0 0x00007f4fc5a89a19 __futex_abstimed_wait_common (libc.so.6 + 0x89a19)
#1 0x00007f4fc5a8c210 pthread_cond_wait@@GLIBC_2.3.2 (libc.so.6 + 0x8c210)
#2 0x00007f4fad8c18fb util_queue_thread_func (radeonsi_dri.so + 0xc18fb)
#3 0x00007f4fad8c154b impl_thrd_routine (radeonsi_dri.so + 0xc154b)
#4 0x00007f4fc5a8ce2d start_thread (libc.so.6 + 0x8ce2d)
#5 0x00007f4fc5b12620 __clone3 (libc.so.6 + 0x112620)
Stack trace of thread 970424:
#0 0x00007f4fc5a89a19 __futex_abstimed_wait_common (libc.so.6 + 0x89a19)
#1 0x00007f4fc5a8c210 pthread_cond_wait@@GLIBC_2.3.2 (libc.so.6 + 0x8c210)
#2 0x00007f4fad8c18fb util_queue_thread_func (radeonsi_dri.so + 0xc18fb)
#3 0x00007f4fad8c154b impl_thrd_routine (radeonsi_dri.so + 0xc154b)
#4 0x00007f4fc5a8ce2d start_thread (libc.so.6 + 0x8ce2d)
#5 0x00007f4fc5b12620 __clone3 (libc.so.6 + 0x112620)
Stack trace of thread 970439:
#0 0x00007f4fc5b05cee ppoll (libc.so.6 + 0x105cee)
#1 0x00007f4fa402d341 pa_mainloop_poll (libpulse.so.0 + 0x1e341)
#2 0x00007f4fa4037a5a pa_mainloop_iterate (libpulse.so.0 + 0x28a5a)
#3 0x00007f4fa4037b00 pa_mainloop_run (libpulse.so.0 + 0x28b00)
#4 0x00007f4fbc9380f3 HotplugThread (libSDL2-2.0.so.0 + 0xec0f3)
#5 0x00007f4fbc9799ef RunThread.lto_priv.0 (libSDL2-2.0.so.0 + 0x12d9ef)
#6 0x00007f4fc5a8ce2d start_thread (libc.so.6 + 0x8ce2d)
#7 0x00007f4fc5b12620 __clone3 (libc.so.6 + 0x112620)
Stack trace of thread 970431:
#0 0x00007f4fc5a89a19 __futex_abstimed_wait_common (libc.so.6 + 0x89a19)
#1 0x00007f4fc5a8c210 pthread_cond_wait@@GLIBC_2.3.2 (libc.so.6 + 0x8c210)
#2 0x00007f4fad8c18fb util_queue_thread_func (radeonsi_dri.so + 0xc18fb)
#3 0x00007f4fad8c154b impl_thrd_routine (radeonsi_dri.so + 0xc154b)
#4 0x00007f4fc5a8ce2d start_thread (libc.so.6 + 0x8ce2d)
#5 0x00007f4fc5b12620 __clone3 (libc.so.6 + 0x112620)
Stack trace of thread 970422:
#0 0x00007f4fc5a89a19 __futex_abstimed_wait_common (libc.so.6 + 0x89a19)
#1 0x00007f4fc5a8c210 pthread_cond_wait@@GLIBC_2.3.2 (libc.so.6 + 0x8c210)
#2 0x00007f4fad8c18fb util_queue_thread_func (radeonsi_dri.so + 0xc18fb)
#3 0x00007f4fad8c154b impl_thrd_routine (radeonsi_dri.so + 0xc154b)
#4 0x00007f4fc5a8ce2d start_thread (libc.so.6 + 0x8ce2d)
#5 0x00007f4fc5b12620 __clone3 (libc.so.6 + 0x112620)
Stack trace of thread 970437:
#0 0x00007f4fc5a89a19 __futex_abstimed_wait_common (libc.so.6 + 0x89a19)
#1 0x00007f4fc5a8c210 pthread_cond_wait@@GLIBC_2.3.2 (libc.so.6 + 0x8c210)
#2 0x00007f4fad8c18fb util_queue_thread_func (radeonsi_dri.so + 0xc18fb)
#3 0x00007f4fad8c154b impl_thrd_routine (radeonsi_dri.so + 0xc154b)
#4 0x00007f4fc5a8ce2d start_thread (libc.so.6 + 0x8ce2d)
#5 0x00007f4fc5b12620 __clone3 (libc.so.6 + 0x112620)
Stack trace of thread 970423:
#0 0x00007f4fc5a89a19 __futex_abstimed_wait_common (libc.so.6 + 0x89a19)
#1 0x00007f4fc5a8c210 pthread_cond_wait@@GLIBC_2.3.2 (libc.so.6 + 0x8c210)
#2 0x00007f4fad8c18fb util_queue_thread_func (radeonsi_dri.so + 0xc18fb)
#3 0x00007f4fad8c154b impl_thrd_routine (radeonsi_dri.so + 0xc154b)
#4 0x00007f4fc5a8ce2d start_thread (libc.so.6 + 0x8ce2d)
#5 0x00007f4fc5b12620 __clone3 (libc.so.6 + 0x112620)
Stack trace of thread 970435:
#0 0x00007f4fc5a89a19 __futex_abstimed_wait_common (libc.so.6 + 0x89a19)
#1 0x00007f4fc5a8c210 pthread_cond_wait@@GLIBC_2.3.2 (libc.so.6 + 0x8c210)
#2 0x00007f4fad8c18fb util_queue_thread_func (radeonsi_dri.so + 0xc18fb)
#3 0x00007f4fad8c154b impl_thrd_routine (radeonsi_dri.so + 0xc154b)
#4 0x00007f4fc5a8ce2d start_thread (libc.so.6 + 0x8ce2d)
#5 0x00007f4fc5b12620 __clone3 (libc.so.6 + 0x112620)
Stack trace of thread 970429:
#0 0x00007f4fc5a89a19 __futex_abstimed_wait_common (libc.so.6 + 0x89a19)
#1 0x00007f4fc5a8c210 pthread_cond_wait@@GLIBC_2.3.2 (libc.so.6 + 0x8c210)
#2 0x00007f4fad8c18fb util_queue_thread_func (radeonsi_dri.so + 0xc18fb)
#3 0x00007f4fad8c154b impl_thrd_routine (radeonsi_dri.so + 0xc154b)
#4 0x00007f4fc5a8ce2d start_thread (libc.so.6 + 0x8ce2d)
#5 0x00007f4fc5b12620 __clone3 (libc.so.6 + 0x112620)
Stack trace of thread 970434:
#0 0x00007f4fc5a89a19 __futex_abstimed_wait_common (libc.so.6 + 0x89a19)
#1 0x00007f4fc5a8c210 pthread_cond_wait@@GLIBC_2.3.2 (libc.so.6 + 0x8c210)
#2 0x00007f4fad8c18fb util_queue_thread_func (radeonsi_dri.so + 0xc18fb)
#3 0x00007f4fad8c154b impl_thrd_routine (radeonsi_dri.so + 0xc154b)
#4 0x00007f4fc5a8ce2d start_thread (libc.so.6 + 0x8ce2d)
#5 0x00007f4fc5b12620 __clone3 (libc.so.6 + 0x112620)
Stack trace of thread 970438:
#0 0x00007f4fc5a89a19 __futex_abstimed_wait_common (libc.so.6 + 0x89a19)
#1 0x00007f4fc5a8c210 pthread_cond_wait@@GLIBC_2.3.2 (libc.so.6 + 0x8c210)
#2 0x00007f4fad8c18fb util_queue_thread_func (radeonsi_dri.so + 0xc18fb)
#3 0x00007f4fad8c154b impl_thrd_routine (radeonsi_dri.so + 0xc154b)
#4 0x00007f4fc5a8ce2d start_thread (libc.so.6 + 0x8ce2d)
#5 0x00007f4fc5b12620 __clone3 (libc.so.6 + 0x112620)
Stack trace of thread 970436:
#0 0x00007f4fc5a89a19 __futex_abstimed_wait_common (libc.so.6 + 0x89a19)
#1 0x00007f4fc5a8c210 pthread_cond_wait@@GLIBC_2.3.2 (libc.so.6 + 0x8c210)
#2 0x00007f4fad8c18fb util_queue_thread_func (radeonsi_dri.so + 0xc18fb)
#3 0x00007f4fad8c154b impl_thrd_routine (radeonsi_dri.so + 0xc154b)
#4 0x00007f4fc5a8ce2d start_thread (libc.so.6 + 0x8ce2d)
#5 0x00007f4fc5b12620 __clone3 (libc.so.6 + 0x112620)
Stack trace of thread 970432:
#0 0x00007f4fc5a89a19 __futex_abstimed_wait_common (libc.so.6 + 0x89a19)
#1 0x00007f4fc5a8c210 pthread_cond_wait@@GLIBC_2.3.2 (libc.so.6 + 0x8c210)
#2 0x00007f4fad8c18fb util_queue_thread_func (radeonsi_dri.so + 0xc18fb)
#3 0x00007f4fad8c154b impl_thrd_routine (radeonsi_dri.so + 0xc154b)
#4 0x00007f4fc5a8ce2d start_thread (libc.so.6 + 0x8ce2d)
#5 0x00007f4fc5b12620 __clone3 (libc.so.6 + 0x112620)
Stack trace of thread 970430:
#0 0x00007f4fc5a89a19 __futex_abstimed_wait_common (libc.so.6 + 0x89a19)
#1 0x00007f4fc5a8c210 pthread_cond_wait@@GLIBC_2.3.2 (libc.so.6 + 0x8c210)
#2 0x00007f4fad8c18fb util_queue_thread_func (radeonsi_dri.so + 0xc18fb)
#3 0x00007f4fad8c154b impl_thrd_routine (radeonsi_dri.so + 0xc154b)
#4 0x00007f4fc5a8ce2d start_thread (libc.so.6 + 0x8ce2d)
#5 0x00007f4fc5b12620 __clone3 (libc.so.6 + 0x112620)
ELF object binary architecture: AMD x86-64
GNU gdb (GDB) Fedora 12.1-1.fc36
Copyright (C) 2022 Free Software Foundation, Inc.
License GPLv3+: GNU GPL version 3 or later <http://gnu.org/licenses/gpl.html>
This is free software: you are free to change and redistribute it.
There is NO WARRANTY, to the extent permitted by law.
Type "show copying" and "show warranty" for details.
This GDB was configured as "x86_64-redhat-linux-gnu".
Type "show configuration" for configuration details.
For bug reporting instructions, please see:
<https://www.gnu.org/software/gdb/bugs/>.
Find the GDB manual and other documentation resources online at:
<http://www.gnu.org/software/gdb/documentation/>.
For help, type "help".
Type "apropos word" to search for commands related to "word"...
Reading symbols from /usr/bin/wallengine...
Reading symbols from /usr/lib/debug/usr/bin/wallengine-0.0.1^20220610.ec6164c-1.fc36.x86_64.debug...
warning: Can't open file /memfd:pulseaudio (deleted) during file-backed mapping note processing
warning: Can't open file /memfd:xshmfence (deleted) during file-backed mapping note processing
[New LWP 970418]
[New LWP 970420]
[New LWP 970419]
[New LWP 970421]
[New LWP 970425]
[New LWP 970426]
[New LWP 970428]
[New LWP 970427]
[New LWP 970433]
[New LWP 970424]
[New LWP 970439]
[New LWP 970431]
[New LWP 970422]
[New LWP 970437]
[New LWP 970423]
[New LWP 970435]
[New LWP 970429]
[New LWP 970434]
[New LWP 970438]
[New LWP 970436]
[New LWP 970432]
[New LWP 970430]
[Thread debugging using libthread_db enabled]
Using host libthread_db library "/lib64/libthread_db.so.1".
--Type <RET> for more, q to quit, c to continue without paging--c
Core was generated by `wallengine 2270452711'.
Program terminated with signal SIGABRT, Aborted.
#0 __pthread_kill_implementation (threadid=<optimized out>, signo=signo@entry=6, no_tid=no_tid@entry=0) at pthread_kill.c:44
44 return INTERNAL_SYSCALL_ERROR_P (ret) ? INTERNAL_SYSCALL_ERRNO (ret) : 0;
[Current thread is 1 (Thread 0x7f4fbe5ef3c0 (LWP 970418))]
(gdb)
```
**Desktop (please complete the following information):**
- OS: Fedora 36
- Desktop Environment: KDE
- Window Manager: KWin
|
process
|
wallpaper engine background s console output z wallengine no scene pkg file found at home tarulia local share steam steamapps workshop content defaulting to normal folder storage found wallpaper engine s assets at home tarulia local share steam steamapps common wallpaper engine assets terminate called after throwing an instance of wallpaperengine assets cassetloadexception what cannot find file gifscene json cannot find file in any of the containers zsh iot instruction core dumped wallengine z coredumpctl debug pid wallengine uid tarulia gid tarulia signal abrt timestamp thu cest ago command line wallengine executable usr bin wallengine control group user slice user slice user service app slice app konsole scope unit user service user unit app konsole scope slice user slice owner uid tarulia boot id machine id hostname fedora storage var lib systemd coredump core wallengine zst present disk size message process wallengine of user dumped core module linux vdso so with build id module libflac so with build id metadata for module libflac so owned by fdo found type rpm name flac version architecture oscpe cpe o fedoraproject fedora module libasyncns so with build id metadata for module libasyncns so owned by fdo found type rpm name libasyncns version architecture oscpe cpe o fedoraproject fedora module libsndfile so with build id metadata for module libsndfile so owned by fdo found type rpm name libsndfile version architecture oscpe cpe o fedoraproject fedora module libpulsecommon so with build id metadata for module libpulsecommon so owned by fdo found type rpm name pulseaudio version architecture oscpe cpe o fedoraproject fedora module libpulse so with build id metadata for module libpulse so owned by fdo found type rpm name pulseaudio version architecture oscpe cpe o fedoraproject fedora module libpulse simple so with build id metadata for module libpulse simple so owned by fdo found type rpm name pulseaudio version architecture oscpe cpe o fedoraproject fedora 
module libdbus so with build id metadata for module libdbus so owned by fdo found type rpm name dbus version architecture oscpe cpe o fedoraproject fedora module libtinfo so with build id metadata for module libtinfo so owned by fdo found type rpm name ncurses version architecture oscpe cpe o fedoraproject fedora module libedit so with build id stack trace of thread pthread kill implementation libc so raise libc so abort libc so gnu verbose terminate handlerev cold libstdc so terminateepfvve libstdc so libstdc so cxa throw libstdc so traitsicesaiceeepj cold wallengine traitsicesaiceeepns wallengine traitsicesaiceeepns wallengine traitsicesaiceeepns wallengine main wallengine libc start call main libc so libc start main glibc libc so start wallengine stack trace of thread libc so libc so is two character sub directory radeonsi dri so choose lru file matching radeonsi dri so disk cache evict lru item radeonsi dri so cache put radeonsi dri so util queue thread func radeonsi dri so impl thrd routine radeonsi dri so start thread libc so libc so stack trace of thread futex abstimed wait common libc so pthread cond wait glibc libc so util queue thread func radeonsi dri so impl thrd routine radeonsi dri so start thread libc so libc so stack trace of thread futex abstimed wait common libc so pthread cond wait glibc libc so util queue thread func radeonsi dri so impl thrd routine radeonsi dri so start thread libc so libc so stack trace of thread futex abstimed wait common libc so pthread cond wait glibc libc so util queue thread func radeonsi dri so impl thrd routine radeonsi dri so start thread libc so libc so stack trace of thread futex abstimed wait common libc so pthread cond wait glibc libc so util queue thread func radeonsi dri so impl thrd routine radeonsi dri so start thread libc so libc so stack trace of thread futex abstimed wait common libc so pthread cond wait glibc libc so util queue thread func radeonsi dri so impl thrd routine radeonsi dri so start thread libc 
so libc so stack trace of thread futex abstimed wait common libc so pthread cond wait glibc libc so util queue thread func radeonsi dri so impl thrd routine radeonsi dri so start thread libc so libc so stack trace of thread futex abstimed wait common libc so pthread cond wait glibc libc so util queue thread func radeonsi dri so impl thrd routine radeonsi dri so start thread libc so libc so stack trace of thread futex abstimed wait common libc so pthread cond wait glibc libc so util queue thread func radeonsi dri so impl thrd routine radeonsi dri so start thread libc so libc so stack trace of thread ppoll libc so pa mainloop poll libpulse so pa mainloop iterate libpulse so pa mainloop run libpulse so hotplugthread so runthread lto priv so start thread libc so libc so stack trace of thread futex abstimed wait common libc so pthread cond wait glibc libc so util queue thread func radeonsi dri so impl thrd routine radeonsi dri so start thread libc so libc so stack trace of thread futex abstimed wait common libc so pthread cond wait glibc libc so util queue thread func radeonsi dri so impl thrd routine radeonsi dri so start thread libc so libc so stack trace of thread futex abstimed wait common libc so pthread cond wait glibc libc so util queue thread func radeonsi dri so impl thrd routine radeonsi dri so start thread libc so libc so stack trace of thread futex abstimed wait common libc so pthread cond wait glibc libc so util queue thread func radeonsi dri so impl thrd routine radeonsi dri so start thread libc so libc so stack trace of thread futex abstimed wait common libc so pthread cond wait glibc libc so util queue thread func radeonsi dri so impl thrd routine radeonsi dri so start thread libc so libc so stack trace of thread futex abstimed wait common libc so pthread cond wait glibc libc so util queue thread func radeonsi dri so impl thrd routine radeonsi dri so start thread libc so libc so stack trace of thread futex abstimed wait common libc so pthread cond wait 
glibc libc so util queue thread func radeonsi dri so impl thrd routine radeonsi dri so start thread libc so libc so stack trace of thread futex abstimed wait common libc so pthread cond wait glibc libc so util queue thread func radeonsi dri so impl thrd routine radeonsi dri so start thread libc so libc so stack trace of thread futex abstimed wait common libc so pthread cond wait glibc libc so util queue thread func radeonsi dri so impl thrd routine radeonsi dri so start thread libc so libc so stack trace of thread futex abstimed wait common libc so pthread cond wait glibc libc so util queue thread func radeonsi dri so impl thrd routine radeonsi dri so start thread libc so libc so stack trace of thread futex abstimed wait common libc so pthread cond wait glibc libc so util queue thread func radeonsi dri so impl thrd routine radeonsi dri so start thread libc so libc so elf object binary architecture amd gnu gdb gdb fedora copyright c free software foundation inc license gnu gpl version or later this is free software you are free to change and redistribute it there is no warranty to the extent permitted by law type show copying and show warranty for details this gdb was configured as redhat linux gnu type show configuration for configuration details for bug reporting instructions please see find the gdb manual and other documentation resources online at for help type help type apropos word to search for commands related to word reading symbols from usr bin wallengine reading symbols from usr lib debug usr bin wallengine debug warning can t open file memfd pulseaudio deleted during file backed mapping note processing warning can t open file memfd xshmfence deleted during file backed mapping note processing using host libthread db library libthread db so type for more q to quit c to continue without paging c core was generated by wallengine program terminated with signal sigabrt aborted pthread kill implementation threadid signo signo entry no tid no tid entry at 
pthread kill c return internal syscall error p ret internal syscall errno ret gdb desktop please complete the following information os fedora desktop environment kde window manager kwin
| 1
|
22,064
| 30,586,802,228
|
IssuesEvent
|
2023-07-21 13:58:34
|
scverse/anndata
|
https://api.github.com/repos/scverse/anndata
|
opened
|
Set up GPU CI
|
enhancement dev process
|
### Please describe your wishes and possible alternatives to achieve the desired result.
What does GPU CI need?
- [ ] `import anndata, cupy, cupyx.scipy.sparse; cupyx.scipy.sparse.random(100, 50, format="csr")`
- [ ] Pytest activation mark, only run gpu tests
- [ ] When does this run? Start with every time?
- [ ] How does the secret code work?
- It is managed through cirun
- [ ] How do images work? https://aws.amazon.com/marketplace/pp/prodview-7ikjtg3um26wq?sr=0-1&ref_=beagle&applicationId=AWS-EC2-Console
- Do we need to do our own image? @Zethson is following up with cirun. Could be the reason the test can take a while to start.
Will be partially solved by: #1066
cc @Intron7
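One way to sketch the "only run GPU tests" item above: a custom pytest marker plus a collection hook that skips marked tests when no GPU stack is importable. The helper name `gpu_available` and the `gpu` marker name are illustrative assumptions, not anndata's actual CI setup.

```python
import importlib.util


def gpu_available() -> bool:
    # Treat the GPU stack as present when cupy is importable (assumption:
    # cupy importability is a good-enough proxy for a usable GPU runner).
    return importlib.util.find_spec("cupy") is not None


def pytest_collection_modifyitems(config, items):
    # conftest.py hook: add a skip marker to every test tagged
    # @pytest.mark.gpu when the GPU stack is absent.
    import pytest  # imported lazily so gpu_available() stays standalone

    if gpu_available():
        return
    skip_gpu = pytest.mark.skip(reason="cupy not installed")
    for item in items:
        if "gpu" in item.keywords:
            item.add_marker(skip_gpu)
```

On a CPU-only runner the hook skips the `gpu`-marked tests; on the cirun-provisioned GPU machine they run normally.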
|
1.0
|
Set up GPU CI - ### Please describe your wishes and possible alternatives to achieve the desired result.
What does GPU CI need?
- [ ] `import anndata, cupy, cupyx.scipy.sparse; cupyx.scipy.sparse.random(100, 50, format="csr")`
- [ ] Pytest activation mark, only run gpu tests
- [ ] When does this run? Start with every time?
- [ ] How does the secret code work?
- It is managed through cirun
- [ ] How do images work? https://aws.amazon.com/marketplace/pp/prodview-7ikjtg3um26wq?sr=0-1&ref_=beagle&applicationId=AWS-EC2-Console
- Do we need to do our own image? @Zethson is following up with cirun. Could be the reason the test can take a while to start.
Will be partially solved by: #1066
cc @Intron7
|
process
|
set up gpu ci please describe your wishes and possible alternatives to achieve the desired result what does gpu ci need import anndata cupy cupyx scipy sparse random format csr pytest activation mark only run gpu tests when does this run start with every time how does the secret code work it is managed through cirun how do images work do we need to do our own image zethson is following up with cirun could be the reason the test can take a while to start will be partially solved by cc
| 1
|
14,124
| 17,019,216,125
|
IssuesEvent
|
2021-07-02 16:09:55
|
prisma/prisma
|
https://api.github.com/repos/prisma/prisma
|
closed
|
DB pull & migration doesn't work for MongoDB
|
process/candidate team/migrations topic: mongodb
|
Hi Prisma Team! Prisma Migrate just crashed.
## Command
`db pull`
## Versions
| Name | Version |
|-------------|--------------------|
| Platform | debian-openssl-1.1.x|
| Node | v14.15.5 |
| Prisma CLI | 2.25.0 |
| Binary | c838e79f39885bc8e1611849b1eb28b5bb5bc922|
## Error
```
Error: [/root/.cargo/git/checkouts/quaint-9f01e008b9a89c14/ed93633/src/single.rs:159:18] not implemented: Supported url schemes: file or sqlite, mysql, postgresql or jdbc:sqlserver.
```
|
1.0
|
DB pull & migration doesn't work for MongoDB - Hi Prisma Team! Prisma Migrate just crashed.
## Command
`db pull`
## Versions
| Name | Version |
|-------------|--------------------|
| Platform | debian-openssl-1.1.x|
| Node | v14.15.5 |
| Prisma CLI | 2.25.0 |
| Binary | c838e79f39885bc8e1611849b1eb28b5bb5bc922|
## Error
```
Error: [/root/.cargo/git/checkouts/quaint-9f01e008b9a89c14/ed93633/src/single.rs:159:18] not implemented: Supported url schemes: file or sqlite, mysql, postgresql or jdbc:sqlserver.
```
|
process
|
db pull migration doesn t work for mongodb hi prisma team prisma migrate just crashed command db pull versions name version platform debian openssl x node prisma cli binary error error not implemented supported url schemes file or sqlite mysql postgresql or jdbc sqlserver
| 1
|
4,646
| 7,494,789,871
|
IssuesEvent
|
2018-04-07 14:03:06
|
ODiogoSilva/assemblerflow
|
https://api.github.com/repos/ODiogoSilva/assemblerflow
|
closed
|
Add nextflow best practices to initial channel definition
|
enhancement process
|
The creation of channels should follow a standardised practice and include sanity checks on the ``Process`` class definition
|
1.0
|
Add nextflow best practices to initial channel definition - The creation of channels should follow a standardised practice and include sanity checks on the ``Process`` class definition
|
process
|
add nextflow best practices to initial channel definition the creation of channels should follow a standardised practice and include sanity checks on the process class definition
| 1
|
77,276
| 9,986,394,452
|
IssuesEvent
|
2019-07-10 18:59:10
|
zcash/zcash
|
https://api.github.com/repos/zcash/zcash
|
closed
|
Unified documentation for zcash-cli
|
documentation
|
There seems to be general confusion regarding Zcash and upstream RPC. It would be nice to have unified documentation of these calls - extending the payment api doc: https://github.com/zcash/zcash/blob/v1.0.0-beta1/doc/payment-api.md
|
1.0
|
Unified documentation for zcash-cli - There seems to be general confusion regarding Zcash and upstream RPC. It would be nice to have unified documentation of these calls - extending the payment api doc: https://github.com/zcash/zcash/blob/v1.0.0-beta1/doc/payment-api.md
|
non_process
|
unified documentation for zcash cli there seems to be general confusion regarding zcash and upstream rpc it would be nice to have unified documentation of these calls extending the payment api doc
| 0
|
1,913
| 4,750,772,346
|
IssuesEvent
|
2016-10-22 14:36:05
|
AllenFang/react-bootstrap-table
|
https://api.github.com/repos/AllenFang/react-bootstrap-table
|
closed
|
Search fails on number 0
|
bug inprocess
|
When binding to a column with a value of 0 and then typing 0 in the search box doesn't properly return the intended results.
TableDataStore.js
line 489
if (this.colInfos[key] && row[key]) {
This returns 0 (which is false)
Converting the data first to a string gets around the issue, but not the desired solution.
|
1.0
|
Search fails on number 0 - When binding to a column with a value of 0 and then typing 0 in the search box doesn't properly return the intended results.
TableDataStore.js
line 489
if (this.colInfos[key] && row[key]) {
This returns 0 (which is false)
Converting the data first to a string gets around the issue, but not the desired solution.
|
process
|
search fails on number when binding to a column with a value of and then typing in the search box doesn t properly return the intended results tabledatastore js line if this colinfos row this returns which is false converting the data first to a string gets around the issue but not the desired solution
| 1
|
13,263
| 15,729,960,401
|
IssuesEvent
|
2021-03-29 15:23:51
|
kubeflow/pipelines
|
https://api.github.com/repos/kubeflow/pipelines
|
closed
|
[frontend] Upgrade Argo UI template to match original content
|
area/frontend kind/process
|
We have upgraded backend to use newer Argo version `2.12.9`:https://github.com/kubeflow/pipelines/pull/5266. Based on https://github.com/kubeflow/pipelines/pull/5339#discussion_r598079077, we need to make sure we are also updating other contents in `/kubeflow/pipelines/frontend/third_party/argo-ui/argo_template.ts` to match https://github.com/argoproj/argo-workflows/blob/80b5ab9b8e35b4dba71396062abe32918cd76ddd/ui/src/models/workflows.ts#L863-L873.
|
1.0
|
[frontend] Upgrade Argo UI template to match original content - We have upgraded backend to use newer Argo version `2.12.9`:https://github.com/kubeflow/pipelines/pull/5266. Based on https://github.com/kubeflow/pipelines/pull/5339#discussion_r598079077, we need to make sure we are also updating other contents in `/kubeflow/pipelines/frontend/third_party/argo-ui/argo_template.ts` to match https://github.com/argoproj/argo-workflows/blob/80b5ab9b8e35b4dba71396062abe32918cd76ddd/ui/src/models/workflows.ts#L863-L873.
|
process
|
upgrade argo ui template to match original content we have upgraded backend to use newer argo version based on we need to make sure we are also updating other contents in kubeflow pipelines frontend third party argo ui argo template ts to match
| 1
|
118,726
| 11,987,091,377
|
IssuesEvent
|
2020-04-07 20:32:36
|
gravitational/gravity
|
https://api.github.com/repos/gravitational/gravity
|
opened
|
Validate RHEL 7.8
|
backport/5.5 backport/6.1 backport/6.3 documentation enhancement priority/1 support-load
|
RHEL 7.8 is out and we should see if our active Gravity versions can install there and add it to our docs.
|
1.0
|
Validate RHEL 7.8 - RHEL 7.8 is out and we should see if our active Gravity versions can install there and add it to our docs.
|
non_process
|
validate rhel rhel is out and we should see if our active gravity versions can install there and add it to our docs
| 0
|
19,884
| 26,329,641,305
|
IssuesEvent
|
2023-01-10 09:49:26
|
geneontology/go-ontology
|
https://api.github.com/repos/geneontology/go-ontology
|
closed
|
protein quality control issues
|
low priority PomBase protein processing and quality control
|
~I am looking for
chaperone-mediated (protein) refolding
I can find
GO:0061077 chaperone-mediated protein folding
bout the 'refolding' child is called
GO:0051085 chaperone cofactor-dependent protein refolding
(I don't know why the process needs to specify whether a cofacter is present or not. How would we know?)
There are other strange children here
GO:0061992 ATP-dependent chaperone mediated protein folding | is_a
GO:1990507 ATP-independent chaperone mediated protein folding | is_a
GO:0051086 chaperone mediated protein folding independent of cofactor
should these merge? whether something is ATP dependent should be part of the activity of a specific gene product rather thanthe process (which could not be known at this stage).
For example GO:1990507 has 4 annotation (2 genes), I think this just meant this particular chaperone does not require ATP?~
out of date
|
1.0
|
protein quality control issues - ~I am looking for
chaperone-mediated (protein) refolding
I can find
GO:0061077 chaperone-mediated protein folding
bout the 'refolding' child is called
GO:0051085 chaperone cofactor-dependent protein refolding
(I don't know why the process needs to specify whether a cofacter is present or not. How would we know?)
There are other strange children here
GO:0061992 ATP-dependent chaperone mediated protein folding | is_a
GO:1990507 ATP-independent chaperone mediated protein folding | is_a
GO:0051086 chaperone mediated protein folding independent of cofactor
should these merge? whether something is ATP dependent should be part of the activity of a specific gene product rather thanthe process (which could not be known at this stage).
For example GO:1990507 has 4 annotation (2 genes), I think this just meant this particular chaperone does not require ATP?~
out of date
|
process
|
protein quality control issues i am looking for chaperone mediated protein refolding i can find go chaperone mediated protein folding bout the refolding child is called go chaperone cofactor dependent protein refolding i don t know why the process needs to specify whether a cofacter is present or not how would we know there are other strange children here go atp dependent chaperone mediated protein folding is a go atp independent chaperone mediated protein folding is a go chaperone mediated protein folding independent of cofactor should these merge whether something is atp dependent should be part of the activity of a specific gene product rather thanthe process which could not be known at this stage for example go has annotation genes i think this just meant this particular chaperone does not require atp out of date
| 1
|
157,848
| 12,393,412,552
|
IssuesEvent
|
2020-05-20 15:23:03
|
nedroden/Artemis
|
https://api.github.com/repos/nedroden/Artemis
|
opened
|
Registration
|
feature testing required
|
Has already been implemented, but should be tested further before this can be considered 'done'. As of 5/20/2020, the following things still need to be done:
* Write more unit tests
* Manual testing
* Improve UX/process where possible
|
1.0
|
Registration - Has already been implemented, but should be tested further before this can be considered 'done'. As of 5/20/2020, the following things still need to be done:
* Write more unit tests
* Manual testing
* Improve UX/process where possible
|
non_process
|
registration has already been implemented but should be tested further before this can be considered done as of the following things still need to be done write more unit tests manual testing improve ux process where possible
| 0
|
16,347
| 21,004,329,568
|
IssuesEvent
|
2022-03-29 20:46:30
|
kubernetes/minikube
|
https://api.github.com/repos/kubernetes/minikube
|
closed
|
`update-leaderboard` GitHub Action failing
|
priority/important-longterm lifecycle/stale kind/process
|
Logs: https://github.com/kubernetes/minikube/actions/runs/1426448148/attempts/1
Error:
` ! [rejected] v1.24.0 -> v1.24.0 (would clobber existing tag)`
Caused by line:
https://github.com/kubernetes/minikube/blob/9abb3b5bac3f0c95ba2a1ad3ebe9a96093c5451f/hack/update_contributions.sh#L54
I thought I fixed it with https://github.com/kubernetes/minikube/pull/12563 but still continuing to fail.
Most likely due to the release tag being modified by other release script since the job started.
|
1.0
|
`update-leaderboard` GitHub Action failing - Logs: https://github.com/kubernetes/minikube/actions/runs/1426448148/attempts/1
Error:
` ! [rejected] v1.24.0 -> v1.24.0 (would clobber existing tag)`
Caused by line:
https://github.com/kubernetes/minikube/blob/9abb3b5bac3f0c95ba2a1ad3ebe9a96093c5451f/hack/update_contributions.sh#L54
I thought I fixed it with https://github.com/kubernetes/minikube/pull/12563 but still continuing to fail.
Most likely due to the release tag being modified by other release script since the job started.
|
process
|
update leaderboard github action failing logs error would clobber existing tag caused by line i thought i fixed it with but still continuing to fail most likely due to the release tag being modified by other release script since the job started
| 1
|
722,658
| 24,870,578,156
|
IssuesEvent
|
2022-10-27 14:57:30
|
zephyrproject-rtos/zephyr
|
https://api.github.com/repos/zephyrproject-rtos/zephyr
|
closed
|
Bluetooth: Controller: Transmits packets longer than configured max len
|
bug priority: medium area: Bluetooth area: Bluetooth Controller area: Bluetooth LLCP
|
**Describe the bug**
The fix in #51624 for #51600 exposed the following babblesim CI failure:
```
tests/bluetooth/bsim_bt/bsim_test_gatt_write/tests_scripts/gatt_write.sh FAILED
d_00: @00:00:00.000000 *** Booting Zephyr OS build zephyr-v3.2.0-915-g7231c2d008f3 ***
d_01: @00:00:00.000000 *** Booting Zephyr OS build zephyr-v3.2.0-915-g7231c2d008f3 ***
d_00: @00:00:00.002648 Bluetooth initialized
d_01: @00:00:00.002648 Bluetooth initialized
d_01: @00:00:00.002648 Advertising successfully started
d_00: @00:00:00.003368 start_scan: Scanning successfully started
d_00: @00:00:00.318347 [DEVICE]: ED:71:8F:C2:E4:6E (random), AD evt type 0, AD data len 3, RSSI -35
d_01: @00:00:00.424869 Updated MTU: TX: 23 RX: 23 bytes
d_01: @00:00:00.424869 connected: ED:3B:20:15:18:12 (random) role 1
d_01: @00:00:00.424869 mtu_exchange: Current MTU = 23
d_01: @00:00:00.424869 mtu_exchange: Exchange MTU...
d_00: @00:00:00.424866 Updated MTU: TX: 23 RX: 23 bytes
d_00: @00:00:00.424866 connected: ED:71:8F:C2:E4:6E (random) role 0
d_00: @00:00:00.424866 mtu_exchange: Current MTU = 23
d_00: @00:00:00.424866 mtu_exchange: Exchange MTU...
d_01: @00:00:00.476806 Updated MTU: TX: 247 RX: 247 bytes
d_00: @00:00:00.477712 Updated MTU: TX: 247 RX: 247 bytes
d_00: @00:00:00.526575 Updated MTU: TX: 247 RX: 247 bytes
d_00: @00:00:00.526575 mtu_exchange_cb: MTU exchange successful (247)
d_01: @00:00:00.526861 Updated MTU: TX: 247 RX: 247 bytes
d_01: @00:00:00.526861 mtu_exchange_cb: MTU exchange successful (247)
d_01: @00:00:01.076429 ERROR: (src/bs_pc_2G4_stateless_wo_callbacks.c:64): Tried to request a new tx while some other transaction was ongoing
d_00: @00:00:01.076389 main: The TESTCASE FAILED with return code 1
d_01: @00:00:01.076429 main: The TESTCASE FAILED with return code 1
```
With updated Babblesim ext_NRF52_hw_model the error is:
```
d_00: @00:00:00.000000 *** Booting Zephyr OS build zephyr-v3.2.0-915-gd05957d5e3d4 ***
d_01: @00:00:00.000000 *** Booting Zephyr OS build zephyr-v3.2.0-915-gd05957d5e3d4 ***
d_01: @00:00:00.002648 Bluetooth initialized
d_00: @00:00:00.002648 Bluetooth initialized
d_01: @00:00:00.002648 Advertising successfully started
d_00: @00:00:00.003368 start_scan: Scanning successfully started
d_00: @00:00:00.318347 [DEVICE]: ED:71:8F:C2:E4:6E (random), AD evt type 0, AD data len 3, RSSI -35
d_01: @00:00:00.424869 Updated MTU: TX: 23 RX: 23 bytes
d_01: @00:00:00.424869 connected: ED:3B:20:15:18:12 (random) role 1
d_01: @00:00:00.424869 mtu_exchange: Current MTU = 23
d_01: @00:00:00.424869 mtu_exchange: Exchange MTU...
d_00: @00:00:00.424866 Updated MTU: TX: 23 RX: 23 bytes
d_00: @00:00:00.424866 connected: ED:71:8F:C2:E4:6E (random) role 0
d_00: @00:00:00.424866 mtu_exchange: Current MTU = 23
d_00: @00:00:00.424866 mtu_exchange: Exchange MTU...
d_01: @00:00:00.476806 Updated MTU: TX: 247 RX: 247 bytes
d_00: @00:00:00.477712 Updated MTU: TX: 247 RX: 247 bytes
d_00: @00:00:00.526575 Updated MTU: TX: 247 RX: 247 bytes
d_00: @00:00:00.526575 mtu_exchange_cb: MTU exchange successful (247)
d_01: @00:00:00.526861 Updated MTU: TX: 247 RX: 247 bytes
d_01: @00:00:00.526861 mtu_exchange_cb: MTU exchange successful (247)
d_01: @00:00:01.076092 ERROR: (WEST_TOPDIR/modules/bsim_hw_models/nrf_hw_models/src/HW_models/NRF_RADIO.c:918): NRF_RADIO: received a packet longer than the configured max lenght (251>27), this is not yet handled in this models. I stop before it gets confusing
d_01: @00:00:01.076092 main: The TESTCASE FAILED with return code 1
d_00: @00:00:01.076127 main: The TESTCASE FAILED with return code 1
```
This indicates that the central device is transmitting larger packet while peripheral device has not yet configured the radio to accept the larger packet.
This could be indicating a bug in the state machine transition or its implementation that is used to configure the radio to change to new Rx length octets.
Please also mention any information which could help others to understand
the problem you're facing:
- What target platform are you using?
Issue encounter in BabbleSim CI testing of #51624 and #50292
- What have you tried to diagnose or workaround this issue?
As part of debugging @aescolar helped add error in NRF52 hw model to expose the Tx Data Length violation using: https://github.com/BabbleSim/ext_NRF52_hw_models/commit/b53911db99c8219f3faafc5a4fcbe2f4f63fccf2
- Is this a regression? If yes, have you been able to "git bisect" it to a
specific commit?
This is not a regression for refactored LLCP implementation, but possibly a bug. The issue does not exist in legacy control procedure implementation. The test does not fail when using `CONFIG_BT_LL_SW_LLCP_LEGACY=y`. But does not mean it was not present, probably the implementation did not happens to cause the transmission to be larger before the peer has configured for larger reception.
**To Reproduce**
Steps to reproduce the behavior:
1. `sh tests/bluetooth/bsim_bt/compile.sh`
2. `sh tests/bluetooth/bsim_bt/bsim_bt_gatt_write/tests_scripts/gatt_write.sh`
**Expected behavior**
The `gatt_write.sh` BabbleSim CI test passes with error.
**Impact**
showstopper, blocks #51624 and hence #51600
**Logs and console output**
As in the description.
**Environment (please complete the following information):**
- OS: Linux
- Toolchain: Zephyr SDK
- Commit SHA or Version used: d05957d5e3d44aec1276a8968a943f8116e42514
**Additional context**
None
|
1.0
|
Bluetooth: Controller: Transmits packets longer than configured max len - **Describe the bug**
The fix in #51624 for #51600 exposed the following babblesim CI failure:
```
tests/bluetooth/bsim_bt/bsim_test_gatt_write/tests_scripts/gatt_write.sh FAILED
d_00: @00:00:00.000000 *** Booting Zephyr OS build zephyr-v3.2.0-915-g7231c2d008f3 ***
d_01: @00:00:00.000000 *** Booting Zephyr OS build zephyr-v3.2.0-915-g7231c2d008f3 ***
d_00: @00:00:00.002648 Bluetooth initialized
d_01: @00:00:00.002648 Bluetooth initialized
d_01: @00:00:00.002648 Advertising successfully started
d_00: @00:00:00.003368 start_scan: Scanning successfully started
d_00: @00:00:00.318347 [DEVICE]: ED:71:8F:C2:E4:6E (random), AD evt type 0, AD data len 3, RSSI -35
d_01: @00:00:00.424869 Updated MTU: TX: 23 RX: 23 bytes
d_01: @00:00:00.424869 connected: ED:3B:20:15:18:12 (random) role 1
d_01: @00:00:00.424869 mtu_exchange: Current MTU = 23
d_01: @00:00:00.424869 mtu_exchange: Exchange MTU...
d_00: @00:00:00.424866 Updated MTU: TX: 23 RX: 23 bytes
d_00: @00:00:00.424866 connected: ED:71:8F:C2:E4:6E (random) role 0
d_00: @00:00:00.424866 mtu_exchange: Current MTU = 23
d_00: @00:00:00.424866 mtu_exchange: Exchange MTU...
d_01: @00:00:00.476806 Updated MTU: TX: 247 RX: 247 bytes
d_00: @00:00:00.477712 Updated MTU: TX: 247 RX: 247 bytes
d_00: @00:00:00.526575 Updated MTU: TX: 247 RX: 247 bytes
d_00: @00:00:00.526575 mtu_exchange_cb: MTU exchange successful (247)
d_01: @00:00:00.526861 Updated MTU: TX: 247 RX: 247 bytes
d_01: @00:00:00.526861 mtu_exchange_cb: MTU exchange successful (247)
d_01: @00:00:01.076429 ERROR: (src/bs_pc_2G4_stateless_wo_callbacks.c:64): Tried to request a new tx while some other transaction was ongoing
d_00: @00:00:01.076389 main: The TESTCASE FAILED with return code 1
d_01: @00:00:01.076429 main: The TESTCASE FAILED with return code 1
```
With updated Babblesim ext_NRF52_hw_model the error is:
```
d_00: @00:00:00.000000 *** Booting Zephyr OS build zephyr-v3.2.0-915-gd05957d5e3d4 ***
d_01: @00:00:00.000000 *** Booting Zephyr OS build zephyr-v3.2.0-915-gd05957d5e3d4 ***
d_01: @00:00:00.002648 Bluetooth initialized
d_00: @00:00:00.002648 Bluetooth initialized
d_01: @00:00:00.002648 Advertising successfully started
d_00: @00:00:00.003368 start_scan: Scanning successfully started
d_00: @00:00:00.318347 [DEVICE]: ED:71:8F:C2:E4:6E (random), AD evt type 0, AD data len 3, RSSI -35
d_01: @00:00:00.424869 Updated MTU: TX: 23 RX: 23 bytes
d_01: @00:00:00.424869 connected: ED:3B:20:15:18:12 (random) role 1
d_01: @00:00:00.424869 mtu_exchange: Current MTU = 23
d_01: @00:00:00.424869 mtu_exchange: Exchange MTU...
d_00: @00:00:00.424866 Updated MTU: TX: 23 RX: 23 bytes
d_00: @00:00:00.424866 connected: ED:71:8F:C2:E4:6E (random) role 0
d_00: @00:00:00.424866 mtu_exchange: Current MTU = 23
d_00: @00:00:00.424866 mtu_exchange: Exchange MTU...
d_01: @00:00:00.476806 Updated MTU: TX: 247 RX: 247 bytes
d_00: @00:00:00.477712 Updated MTU: TX: 247 RX: 247 bytes
d_00: @00:00:00.526575 Updated MTU: TX: 247 RX: 247 bytes
d_00: @00:00:00.526575 mtu_exchange_cb: MTU exchange successful (247)
d_01: @00:00:00.526861 Updated MTU: TX: 247 RX: 247 bytes
d_01: @00:00:00.526861 mtu_exchange_cb: MTU exchange successful (247)
d_01: @00:00:01.076092 ERROR: (WEST_TOPDIR/modules/bsim_hw_models/nrf_hw_models/src/HW_models/NRF_RADIO.c:918): NRF_RADIO: received a packet longer than the configured max lenght (251>27), this is not yet handled in this models. I stop before it gets confusing
d_01: @00:00:01.076092 main: The TESTCASE FAILED with return code 1
d_00: @00:00:01.076127 main: The TESTCASE FAILED with return code 1
```
This indicates that the central device is transmitting larger packet while peripheral device has not yet configured the radio to accept the larger packet.
This could be indicating a bug in the state machine transition or its implementation that is used to configure the radio to change to new Rx length octets.
Please also mention any information which could help others to understand
the problem you're facing:
- What target platform are you using?
Issue encounter in BabbleSim CI testing of #51624 and #50292
- What have you tried to diagnose or workaround this issue?
As part of debugging @aescolar helped add error in NRF52 hw model to expose the Tx Data Length violation using: https://github.com/BabbleSim/ext_NRF52_hw_models/commit/b53911db99c8219f3faafc5a4fcbe2f4f63fccf2
- Is this a regression? If yes, have you been able to "git bisect" it to a
specific commit?
This is not a regression for refactored LLCP implementation, but possibly a bug. The issue does not exist in legacy control procedure implementation. The test does not fail when using `CONFIG_BT_LL_SW_LLCP_LEGACY=y`. But does not mean it was not present, probably the implementation did not happens to cause the transmission to be larger before the peer has configured for larger reception.
**To Reproduce**
Steps to reproduce the behavior:
1. `sh tests/bluetooth/bsim_bt/compile.sh`
2. `sh tests/bluetooth/bsim_bt/bsim_bt_gatt_write/tests_scripts/gatt_write.sh`
**Expected behavior**
The `gatt_write.sh` BabbleSim CI test passes with error.
**Impact**
showstopper, blocks #51624 and hence #51600
**Logs and console output**
As in the description.
**Environment (please complete the following information):**
- OS: Linux
- Toolchain: Zephyr SDK
- Commit SHA or Version used: d05957d5e3d44aec1276a8968a943f8116e42514
**Additional context**
None
|
non_process
|
bluetooth controller transmits packets longer than configured max len describe the bug the fix in for exposed the following babblesim ci failure tests bluetooth bsim bt bsim test gatt write tests scripts gatt write sh failed d booting zephyr os build zephyr d booting zephyr os build zephyr d bluetooth initialized d bluetooth initialized d advertising successfully started d start scan scanning successfully started d ed random ad evt type ad data len rssi d updated mtu tx rx bytes d connected ed random role d mtu exchange current mtu d mtu exchange exchange mtu d updated mtu tx rx bytes d connected ed random role d mtu exchange current mtu d mtu exchange exchange mtu d updated mtu tx rx bytes d updated mtu tx rx bytes d updated mtu tx rx bytes d mtu exchange cb mtu exchange successful d updated mtu tx rx bytes d mtu exchange cb mtu exchange successful d error src bs pc stateless wo callbacks c tried to request a new tx while some other transaction was ongoing d main the testcase failed with return code d main the testcase failed with return code with updated babblesim ext hw model the error is d booting zephyr os build zephyr d booting zephyr os build zephyr d bluetooth initialized d bluetooth initialized d advertising successfully started d start scan scanning successfully started d ed random ad evt type ad data len rssi d updated mtu tx rx bytes d connected ed random role d mtu exchange current mtu d mtu exchange exchange mtu d updated mtu tx rx bytes d connected ed random role d mtu exchange current mtu d mtu exchange exchange mtu d updated mtu tx rx bytes d updated mtu tx rx bytes d updated mtu tx rx bytes d mtu exchange cb mtu exchange successful d updated mtu tx rx bytes d mtu exchange cb mtu exchange successful d error west topdir modules bsim hw models nrf hw models src hw models nrf radio c nrf radio received a packet longer than the configured max lenght this is not yet handled in this models i stop before it gets confusing d main the testcase failed with 
return code d main the testcase failed with return code this indicates that the central device is transmitting larger packet while peripheral device has not yet configured the radio to accept the larger packet this could be indicating a bug in the state machine transition or its implementation that is used to configure the radio to change to new rx length octets please also mention any information which could help others to understand the problem you re facing what target platform are you using issue encounter in babblesim ci testing of and what have you tried to diagnose or workaround this issue as part of debugging aescolar helped add error in hw model to expose the tx data length violation using is this a regression if yes have you been able to git bisect it to a specific commit this is not a regression for refactored llcp implementation but possibly a bug the issue does not exist in legacy control procedure implementation the test does not fail when using config bt ll sw llcp legacy y but does not mean it was not present probably the implementation did not happens to cause the transmission to be larger before the peer has configured for larger reception to reproduce steps to reproduce the behavior sh tests bluetooth bsim bt compile sh sh tests bluetooth bsim bt bsim bt gatt write tests scripts gatt write sh expected behavior the gatt write sh babblesim ci test passes with error impact showstopper blocks and hence logs and console output as in the description environment please complete the following information os linux toolchain zephyr sdk commit sha or version used additional context none
| 0
|
6,248
| 9,206,068,183
|
IssuesEvent
|
2019-03-08 12:36:59
|
astrolabsoftware/fink-broker
|
https://api.github.com/repos/astrolabsoftware/fink-broker
|
opened
|
Clasification: case for small number of alerts
|
processing services
|
In order to perform early classification of the stream (live), we are batching several alerts in tables, and using the xMatch service from the CDS (). This service is powerful to cross-match large catalogs, but it might not be the wisest choice for small number of alerts. Therefore I would advocate to inspect the length of the table to be sent, and if the number of alerts is smaller than some threshold (to be defined), we would use `VO Simple Cone Search` (https://astroquery.readthedocs.io/en/latest/vo_conesearch/vo_conesearch.html) instead of `xMatch`.
|
1.0
|
Clasification: case for small number of alerts - In order to perform early classification of the stream (live), we are batching several alerts in tables, and using the xMatch service from the CDS (). This service is powerful to cross-match large catalogs, but it might not be the wisest choice for small number of alerts. Therefore I would advocate to inspect the length of the table to be sent, and if the number of alerts is smaller than some threshold (to be defined), we would use `VO Simple Cone Search` (https://astroquery.readthedocs.io/en/latest/vo_conesearch/vo_conesearch.html) instead of `xMatch`.
|
process
|
clasification case for small number of alerts in order to perform early classification of the stream live we are batching several alerts in tables and using the xmatch service from the cds this service is powerful to cross match large catalogs but it might not be the wisest choice for small number of alerts therefore i would advocate to inspect the length of the table to be sent and if the number of alerts is smaller than some threshold to be defined we would use vo simple cone search instead of xmatch
| 1
|
11,683
| 14,542,402,756
|
IssuesEvent
|
2020-12-15 15:40:26
|
MicrosoftDocs/azure-devops-docs
|
https://api.github.com/repos/MicrosoftDocs/azure-devops-docs
|
closed
|
expressions in $[ ] reqire spacing, some example don't have it
|
Pri1 devops-cicd-process/tech devops/prod doc-bug
|
There are examples like `myVar:$[eq(1,2)]` on this page.
However, the expression I have written:
```
$[replace(variables['System.CollectionUri'],'https://dev','https://pkgs.dev')]
```
is not evaluated unless it is written with spaces like this:
```
$[ replace(variables['System.CollectionUri'],'https://dev','https://pkgs.dev') ]
```
This is probabbly true across the board for compile and runtime expressions and all functions.
---
#### Document Details
⚠ *Do not edit this section. It is required for docs.microsoft.com ➟ GitHub issue linking.*
* ID: 77c58a78-a567-e99a-9eb7-62dddd1b90b6
* Version Independent ID: 680a79bc-11de-39fc-43e3-e07dc762db18
* Content: [Expressions - Azure Pipelines](https://docs.microsoft.com/en-us/azure/devops/pipelines/process/expressions?view=azure-devops)
* Content Source: [docs/pipelines/process/expressions.md](https://github.com/MicrosoftDocs/azure-devops-docs/blob/master/docs/pipelines/process/expressions.md)
* Product: **devops**
* Technology: **devops-cicd-process**
* GitHub Login: @juliakm
* Microsoft Alias: **jukullam**
|
1.0
|
expressions in $[ ] reqire spacing, some example don't have it - There are examples like `myVar:$[eq(1,2)]` on this page.
However, the expression I have written:
```
$[replace(variables['System.CollectionUri'],'https://dev','https://pkgs.dev')]
```
is not evaluated unless it is written with spaces like this:
```
$[ replace(variables['System.CollectionUri'],'https://dev','https://pkgs.dev') ]
```
This is probabbly true across the board for compile and runtime expressions and all functions.
---
#### Document Details
⚠ *Do not edit this section. It is required for docs.microsoft.com ➟ GitHub issue linking.*
* ID: 77c58a78-a567-e99a-9eb7-62dddd1b90b6
* Version Independent ID: 680a79bc-11de-39fc-43e3-e07dc762db18
* Content: [Expressions - Azure Pipelines](https://docs.microsoft.com/en-us/azure/devops/pipelines/process/expressions?view=azure-devops)
* Content Source: [docs/pipelines/process/expressions.md](https://github.com/MicrosoftDocs/azure-devops-docs/blob/master/docs/pipelines/process/expressions.md)
* Product: **devops**
* Technology: **devops-cicd-process**
* GitHub Login: @juliakm
* Microsoft Alias: **jukullam**
|
process
|
expressions in reqire spacing some example don t have it there are examples like myvar on this page however the expression i have written is not evaluated unless it is written with spaces like this this is probabbly true across the board for compile and runtime expressions and all functions document details ⚠ do not edit this section it is required for docs microsoft com ➟ github issue linking id version independent id content content source product devops technology devops cicd process github login juliakm microsoft alias jukullam
| 1
|
19,862
| 26,271,294,199
|
IssuesEvent
|
2023-01-06 17:14:24
|
zephyrproject-rtos/zephyr
|
https://api.github.com/repos/zephyrproject-rtos/zephyr
|
opened
|
Use fixup commits during code review
|
RFC Process
|
## Introduction
During code review, when PR authors make changes, they should upload fix-up commits containing those changes instead of changing the existing commits and force-pushing. This will streamline reviewer tasks. Once the review is nominally complete, the reviewer should perform a squash rebase and force-push as usual for merging.
### Problem description
Code reviews may go through several rounds of review, in which the author uploads new code to address previous comments, typically by amending their previous commits. Due to limitations of GitHub, it may not be immediately obvious to reviewers which parts of the uploaded code are new to them, so they need to review any files that have changed, usually looking at the entire delta of the PR again. This is particularly burdensome on large PRs with several active reviewers.
### Proposed change
Authors should avoid pushing amended commits during code review. Instead, they should push fix-up commits addressing reviewer comments. When reviewers are satisfied, authors should squash in the fix-up commits and force-push prior to final approval and merging.
## Detailed RFC
### Background
Git's interactive rebase supports `squash` and `fixup` operations on commits. `squash` combines two or more commits into one and combines their commit messages into one. `fixup` combines the commits into one but only retains the first commit message. "Fix-up commits" or "squash commits" are commits that a user intends to `fixup` or `squash` later, typically indicating the base commit onto which the new commit will be fixed up or squashed, e.g.
```
deadbeef fixup: Declare API # Commit message: Adds comments
2468abab Define API
1234abcd Declare API
```
### Proposed change (Detailed)
* During code review, authors will avoid pushing amended commits and instead push fix-up commits containing their new changes.
* Reviewers will review new commits until they are satisfied that the new commits address the concerns they raised about the previous commits.
* When reviewers have satisfied their concerns, authors will squash the previously created fix-up commits and force-push, retaining the original base commit with `git rebase --keep-base`.
* Reviewers will verify that the author did not make any significant changes during this operation, which should be straightforward in GitHub's compare UI. Reviewers will also verify that the overall commit structure is still to their liking. If those things aren't true, review will continue.
* Reviewers will approve and merge the PR.
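As a concrete sketch of the author-side flow above (the repo setup is only there to make the sketch runnable in a throwaway directory; in practice you would already be on your feature branch, and this uses git's native `--fixup` support rather than a manual `FIXUP:` prefix):

```shell
# Throwaway repo so the sketch is self-contained
set -e
demo=$(mktemp -d) && cd "$demo"
git init -q -b main . && git config user.email author@example.com && git config user.name author
echo base > f.txt && git add f.txt && git commit -qm "base"
git checkout -qb feature
echo feature >> f.txt && git commit -qam "Add a feature"   # original commit under review
# ...review comments arrive; record the changes as a fix-up commit...
echo fix >> f.txt && git commit -qa --fixup=HEAD           # creates "fixup! Add a feature"
# (push here so reviewers see only the new delta)
# Once reviewers are satisfied, squash in place without moving the base.
# GIT_SEQUENCE_EDITOR=: accepts the generated todo list non-interactively.
GIT_SEQUENCE_EDITOR=: git rebase -i --autosquash --keep-base main
git log --oneline main..feature    # a single "Add a feature" commit remains
```

`--keep-base` (git 2.24+) keeps the branch anchored to its original merge base, so GitHub's before/after push comparison stays small; a final `git push --force-with-lease` would then update the PR.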
### Dependencies
I think the blast radius of this is pretty minimal. We could try it on a few PRs and see how it goes before making any policy changes.
### Concerns and Unresolved Questions
* Git has facilities to automate `fixup` and `squash` operations like so:
```
git commit # Original commit 1234, "Add a feature"
# Fix formatting
git commit --fixup=1234 # Create a fix-up commit
# Now there are two commits
git rebase --autosquash
# Now there is just one commit, "Add a feature," with correct formatting
```
These operations create and rely on commit headlines with the format `fixup! Previous headline` or `squash! Previous headline`.
Authors could use this feature to automate some of the steps. However, this would get in the way of using `--autosquash` locally on that PR for their own purposes. I think it should be fine if authors use `FIXUP: Previous message` or similar for the fix-up commits they want to upload for review. The reviewer experience should be the same either way.
* This process is probably overkill for small PRs of a few lines that are trivial to re-review. I think fix-ups should be _recommended_ for large PRs.
## Alternatives
### Habitually not rebasing
This proposal primarily addresses weaknesses of GitHub's UI for before-after comparison of force pushes. One annoying behavior is that it will show diffs for every file in the repository, not just those changed by the commits. This makes it basically unusable if the author has rebased the PR. Authors could avoid this by only using `git rebase --keep-base` during the review. Reviewers would then be able to use GitHub to see just the relatively small changes that the author made in response to review comments.
However, this would retain other weaknesses of the GitHub push comparison UI:
* It does not show in which commits the new changes will ultimately land.
* It also does not allow the reviewer to comment in the comparison view, so they need to navigate to a different page with more code and find the code they were looking at previously.
* It only works for the delta introduced by the latest push.
|
1.0
|
Use fixup commits during code review - ## Introduction
During code review, when PR authors make changes, they should upload fix-up commits containing those changes instead of changing the existing commits and force-pushing. This will streamline reviewer tasks. Once the review is nominally complete, the author should perform a squash rebase and force-push as usual for merging.
### Problem description
Code reviews may go through several rounds of review, in which the author uploads new code to address previous comments, typically by amending their previous commits. Due to limitations of GitHub, it may not be immediately obvious to reviewers which parts of the uploaded code are new to them, so they need to review any files that have changed, usually looking at the entire delta of the PR again. This is particularly burdensome on large PRs with several active reviewers.
### Proposed change
Authors should avoid pushing amended commits during code review. Instead, they should push fix-up commits addressing reviewer comments. When reviewers are satisfied, authors should squash in the fix-up commits and force-push prior to final approval and merging.
## Detailed RFC
### Background
Git's interactive rebase supports `squash` and `fixup` operations on commits. `squash` combines two or more commits into one and combines their commit messages into one. `fixup` combines the commits into one but only retains the first commit message. "Fix-up commits" or "squash commits" are commits that a user intends to `fixup` or `squash` later, typically indicating the base commit onto which the new commit will be fixed up or squashed, e.g.
```
deadbeef fixup: Declare API # Commit message: Adds comments
2468abab Define API
1234abcd Declare API
```
### Proposed change (Detailed)
* During code review, authors will avoid pushing amended commits and instead push fix-up commits containing their new changes.
* Reviewers will review new commits until they are satisfied that the new commits address the concerns they raised about the previous commits.
* When reviewers have satisfied their concerns, authors will squash the previously created fix-up commits and force-push, retaining the original base commit with `git rebase --keep-base`.
* Reviewers will verify that the author did not make any significant changes during this operation, which should be straightforward in GitHub's compare UI. Reviewers will also verify that the overall commit structure is still to their liking. If those things aren't true, review will continue.
* Reviewers will approve and merge the PR.
### Dependencies
I think the blast radius of this is pretty minimal. We could try it on a few PRs and see how it goes before making any policy changes.
### Concerns and Unresolved Questions
* Git has facilities to automate `fixup` and `squash` operations like so:
```
git commit # Original commit 1234, "Add a feature"
# Fix formatting
git commit --fixup=1234 # Create a fix-up commit
# Now there are two commits
git rebase --autosquash
# Now there is just one commit, "Add a feature," with correct formatting
```
These operations create and rely on commit headlines with the format `fixup! Previous headline` or `squash! Previous headline`.
Authors could use this feature to automate some of the steps. However, this would get in the way of using `--autosquash` locally on that PR for their own purposes. I think it should be fine if authors use `FIXUP: Previous message` or similar for the fix-up commits they want to upload for review. The reviewer experience should be the same either way.
* This process is probably overkill for small PRs of a few lines that are trivial to re-review. I think fix-ups should be _recommended_ for large PRs.
## Alternatives
### Habitually not rebasing
This proposal primarily addresses weaknesses of GitHub's UI for before-after comparison of force pushes. One annoying behavior is that it will show diffs for every file in the repository, not just those changed by the commits. This makes it basically unusable if the author has rebased the PR. Authors could avoid this by only using `git rebase --keep-base` during the review. Reviewers would then be able to use GitHub to see just the relatively small changes that the author made in response to review comments.
However, this would retain other weaknesses of the GitHub push comparison UI:
* It does not show in which commits the new changes will ultimately land.
* It also does not allow the reviewer to comment in the comparison view, so they need to navigate to a different page with more code and find the code they were looking at previously.
* It only works for the delta introduced by the latest push.
|
process
|
use fixup commits during code review introduction during code review when pr authors make changes they should upload fix up commits containing those changes instead of changing the existing commits and force pushing this will streamline reviewer tasks once the review is nominally complete the reviewer should perform a squash rebase and force push as usual for merging problem description code reviews may go through several rounds of review in which the author uploads new code to address previous comments typically by amending their previous commits due to limitations of github it may not be immediately obvious to reviewers which parts of the uploaded code are new to them so they need to review any files that have changed usually looking at the entire delta of the pr again this is particularly burdensome on large prs with several active reviewers proposed change authors should avoid pushing amended commits during code review instead they should push fix up commits addressing reviewer comments when reviewers are satisfied authors should squash in the fix up commits and force push prior to final approval and merging detailed rfc background git s interactive rebase supports squash and fixup operations on commits squash combines two or more commits into one and combines their commit messages into one fixup combines the commits into one but only retains the first commit message fix up commits or squash commits are commits that a user intends to fixup or squash later typically indicating the base commit onto which the new commit will be fixed up or squashed e g deadbeef fixup declare api commit message adds comments define api declare api proposed change detailed during code review authors will avoid pushing amended commits and instead push fix up commits containing their new changes reviewers will review new commits until they are satisfied that the new commits address the concerns they raised about the previous commits when reviewers have satisfied their concerns authors 
will squash the previously created fix up commits and force push retaining the original base commit with git rebase keep base reviewers will verify that the author did not make any significant changes during this operation which should be straightforward in github s compare ui reviewers will also verify that the overall commit structure is still to their liking if those things aren t true review will continue reviewers will approve and merge the pr dependencies i think the blast radius of this is pretty minimal we could try it on a few prs and see how it goes before making any policy changes concerns and unresolved questions git has facilities to automate fixup and squash operations like so git commit original commit add a feature fix formatting git commit fixup create a fix up commit now there are two commits git rebase autosquash now there is just one commit add a feature with correct formatting these operations create and rely on commit headlines with the format fixup previous headline or squash previous headline authors could use this feature to automate some of the steps however this would get in the way of using autosquash locally on that pr for their own purposes i think it should be fine if authors use fixup previous message or similar for the fix up commits they want to upload for review the reviewer experience should be the same either way this process is probably overkill for small prs of a few lines that are trivial to re review i think fix ups should be recommended for large prs alternatives habitually not rebasing this proposal primarily addresses weaknesses of github s ui for before after comparison of force pushes one annoying behavior is that it will show diffs for every file in the repository not just those changed by the commits this makes it basically unusable if the author has rebased the pr authors could avoid this by only using git rebase keep base during the review reviewers would then be able to use github to see just the relatively small 
changes that the author made in response to review comments however this would retain other weaknesses of the github push comparison ui it does not show in which commits the new changes will ultimately land it also does not allow the reviewer to comment in the comparison view so they need to navigate to a different page with more code and find the code they were looking at previously it only works for the delta introduced by the latest push
| 1
|
89,442
| 10,599,705,015
|
IssuesEvent
|
2019-10-10 08:33:19
|
kartoza/fbf-project
|
https://api.github.com/repos/kartoza/fbf-project
|
closed
|
Add Building and Roads Maps to Technical Reports
|
documentation
|
Assists Tim with beautiful maps (cartography) for technical docs report examples.
|
1.0
|
Add Building and Roads Maps to Technical Reports - Assists Tim with beautiful maps (cartography) for technical docs report examples.
|
non_process
|
add building and roads maps to technical reports assists tim with beautiful maps cartography for technical docs report examples
| 0
|
2,616
| 5,394,392,534
|
IssuesEvent
|
2017-02-27 02:59:04
|
mitchellh/packer
|
https://api.github.com/repos/mitchellh/packer
|
closed
|
OVF Tool update changes default hashing algorithm
|
enhancement post-processor/vsphere
|
Hello,
This issue is about the latest version of OVF Tool used in the VMware ESX builder. It is now in version 4.2 and the default algorithm changed from SHA1 to SHA256. It is currently not possible to change Packer args used with OVF Tool as far as I know.
SHA256 is not supported by the vSphere Client; thus, it is not possible to add an OVF VM to an ESX host with this method. (From what I've read, it is still possible to add them using the vSphere web client, but I haven't tested it.)
I think it would be a good idea to let users change the parameters passed to OVF Tool.
For more details : http://www.virtuallyghetto.com/2016/11/default-hashing-algorithm-changed-in-ovftool-4-2-preventing-ovfova-import-using-vsphere-c-client.html
|
1.0
|
OVF Tool update changes default hashing algorithm - Hello,
This issue is about the latest version of OVF Tool used in the VMware ESX builder. It is now in version 4.2 and the default algorithm changed from SHA1 to SHA256. It is currently not possible to change Packer args used with OVF Tool as far as I know.
SHA256 is not supported by the vSphere Client; thus, it is not possible to add an OVF VM to an ESX host with this method. (From what I've read, it is still possible to add them using the vSphere web client, but I haven't tested it.)
I think it would be a good idea to let users change the parameters passed to OVF Tool.
For more details : http://www.virtuallyghetto.com/2016/11/default-hashing-algorithm-changed-in-ovftool-4-2-preventing-ovfova-import-using-vsphere-c-client.html
|
process
|
ovf tool update changes default hashing algorithm hello this issue is about the latest version of ovf tool used in the vmware esx builder it is now in version and the default algorithm changed from to it is currently not possible to change packer args used with ovf tool as far as i know the is not supported by the vsphere client thus it is not possible to had an ovf vm to an esx with this method it is still possible to add them using vsphere web client as i ve read but i didn t tested it i think it would be a good idea to let users change the parameters they used for more details
| 1
|
97,382
| 28,239,434,444
|
IssuesEvent
|
2023-04-06 05:32:35
|
Autodesk/arnold-usd
|
https://api.github.com/repos/Autodesk/arnold-usd
|
closed
|
Support building with MSVC 14.3
|
enhancement build
|
We need to update SCons to be able to build on Windows with MSVC 14.3
|
1.0
|
Support building with MSVC 14.3 - We need to update SCons to be able to build on Windows with MSVC 14.3
|
non_process
|
support building with msvc we need to update scons to be able to build on windows with msvc
| 0
|
8,283
| 11,447,701,314
|
IssuesEvent
|
2020-02-06 00:39:30
|
scala/community-build
|
https://api.github.com/repos/scala/community-build
|
closed
|
`./narrow` shouldn't modify version-controlled files
|
process
|
think about having `narrow` not modify `projs.conf` directly; it's annoying to have git always thinking it's a change I should check in
perhaps `narrow` could write out a `projs-narrowed.conf` file that would be `.gitignore`d and would be used when present (with `projs.conf` used as a fallback)
also note that after running `narrow` and then running `run.sh`, `dependencies.txt` gets overwritten, too. we should skip doing that if we're in a narrowed state
|
1.0
|
`./narrow` shouldn't modify version-controlled files - think about having `narrow` not modify `projs.conf` directly; it's annoying to have git always thinking it's a change I should check in
perhaps `narrow` could write out a `projs-narrowed.conf` file that would be `.gitignore`d and would be used when present (with `projs.conf` used as a fallback)
also note that after running `narrow` and then running `run.sh`, `dependencies.txt` gets overwritten, too. we should skip doing that if we're in a narrowed state
|
process
|
narrow shouldn t modify version controlled files think about having narrow not modify projs conf directly it s annoying to have git always thinking it s a change i should check in perhaps narrow could write out a projs narrowed conf file that would be gitignore d and would be used when present with projs conf used as a fallback also note that after running narrow and then running run sh dependencies txt gets overwritten too we should skip doing that if we re in a narrowed state
| 1
|
11,779
| 14,613,473,862
|
IssuesEvent
|
2020-12-22 08:18:12
|
symfony/symfony
|
https://api.github.com/repos/symfony/symfony
|
closed
|
[Symfony/Process] setTimeout and setIdleTimeout type hinting inconsistency
|
Process Stalled
|
**Symfony version(s) affected**: 5.0
**Description**
I've implemented a process runner command. While the documentation for `symfony/process` shows an `int` data type, the method signature requires a `nullable float`.

**Possible Solution**
1. Either update the documentation to specify `float` values (e.g.: `3600.0`)
2. Or change the type hint from `?float` to `?int`
Since it's a timeout **in seconds**, I'd go with `int`, but `float` brings flexibility for cases where someone wants, say, a 30.5 second timeout.
|
1.0
|
[Symfony/Process] setTimeout and setIdleTimeout type hinting inconsistency - **Symfony version(s) affected**: 5.0
**Description**
I've implemented a process runner command. While the documentation for `symfony/process` shows an `int` data type, the method signature requires a `nullable float`.

**Possible Solution**
1. Either update the documentation to specify `float` values (e.g.: `3600.0`)
2. Or change the type hint from `?float` to `?int`
Since it's a timeout **in seconds**, I'd go with `int`, but `float` brings flexibility for cases where someone wants, say, a 30.5 second timeout.
|
process
|
settimeout and setidletimeout type hinting inconsistency symfony version s affected description i ve implemented a process runner command while in documentation for symfony process is shown int data type the method signature requires nullable float possible solution either update documentation to specify to use float values e g either change type hinting from float to int since it s about to be timeout in seconds i d go with int but float brings the flexibility for some cases in which someone wants to put let s say seconds timeout
| 1
|
395,366
| 11,684,888,309
|
IssuesEvent
|
2020-03-05 07:55:38
|
yalla-coop/death
|
https://api.github.com/repos/yalla-coop/death
|
closed
|
allow image / voice upload on heroku
|
bug priority-2
|
maybe it is because CORS is currently disabled and we cannot upload images -> maybe use Express to enable it in production?
|
1.0
|
allow image / voice upload on heroku - maybe it is because CORS is currently disabled and we cannot upload images -> maybe use Express to enable it in production?
|
non_process
|
allow image voice upload on heroku maybe it is because currently if cors is disabled and we cannot upload images maybe use express to enable it if in production
| 0
|
37,570
| 18,534,906,004
|
IssuesEvent
|
2021-10-21 10:22:39
|
bbc/simorgh
|
https://api.github.com/repos/bbc/simorgh
|
closed
|
Some Ramda methods are being destructured
|
bug performance
|
**Describe the bug**
Destructuring imports from Ramda does not necessarily prevent importing the entire library. We should manually cherry-pick methods like the following, which would only grab the parts necessary for identity to work:
```
import identity from 'ramda/src/identity'
```
and not destructure like :
```
import { identity } from 'ramda'
```
**To Reproduce**
1. Search the entire Simorgh project for `from 'ramda';`
2. Observe files where methods are not being cherry-picked.
<img width="349" alt="Screenshot 2021-10-21 at 09 31 00" src="https://user-images.githubusercontent.com/4798332/138241700-a7104845-2d4b-4742-bb48-9fc5b167d983.png">
**Expected behaviour**
Refactoring all Ramda imports to cherry-pick methods instead of destructuring.
**Alternatives**
Manually cherry picking methods is cumbersome, however. Most bundlers like Webpack offer tree-shaking as a way to drop unused Ramda code and reduce bundle size.
Webpack + Babel - use babel-plugin-ramda to automatically cherry pick methods.
**Testing notes**
- [ ] This bug fix is expected to need manual testing.
**Checklist**
- [ ] (BBC contributors only) This issue follows the [repository use guidelines](https://github.com/bbc/simorgh-infrastructure/blob/latest/documentation/repository-guidelines.md)
**Additional context**
Add any other context about the problem here.
|
True
|
Some Ramda methods are being destructured - **Describe the bug**
Destructuring imports from Ramda does not necessarily prevent importing the entire library. We should manually cherry-pick methods like the following, which would only grab the parts necessary for identity to work:
```
import identity from 'ramda/src/identity'
```
and not destructure like :
```
import { identity } from 'ramda'
```
**To Reproduce**
1. Search the entire Simorgh project for `from 'ramda';`
2. Observe files where methods are not being cherry-picked.
<img width="349" alt="Screenshot 2021-10-21 at 09 31 00" src="https://user-images.githubusercontent.com/4798332/138241700-a7104845-2d4b-4742-bb48-9fc5b167d983.png">
**Expected behaviour**
Refactoring all Ramda imports to cherry-pick methods instead of destructuring.
**Alternatives**
Manually cherry picking methods is cumbersome, however. Most bundlers like Webpack offer tree-shaking as a way to drop unused Ramda code and reduce bundle size.
Webpack + Babel - use babel-plugin-ramda to automatically cherry pick methods.
**Testing notes**
- [ ] This bug fix is expected to need manual testing.
**Checklist**
- [ ] (BBC contributors only) This issue follows the [repository use guidelines](https://github.com/bbc/simorgh-infrastructure/blob/latest/documentation/repository-guidelines.md)
**Additional context**
Add any other context about the problem here.
|
non_process
|
some ramda methods are being destructured describe the bug destructuring imports from ramda does not necessarily prevent importing the entire library we should manually cherry pick methods like the following which would only grab the parts necessary for identity to work import identity from ramda src identity and not destructure like import identity from ramda to reproduce search the entire simorgh project for from ramda observe files where methods are not being cherry picked img width alt screenshot at src expected behaviour refactoring all ramda imports to cherry pick methods instead of destructuring alternatives manually cherry picking methods is cumbersome however most bundlers like webpack offer tree shaking as a way to drop unused ramda code and reduce bundle size webpack babel use babel plugin ramda to automatically cherry pick methods testing notes this bug fix is expected to need manual testing checklist bbc contributors only this issue follows the additional context add any other context about the problem here
| 0
|
1,199
| 3,698,569,628
|
IssuesEvent
|
2016-02-28 12:25:25
|
nodejs/node
|
https://api.github.com/repos/nodejs/node
|
reopened
|
Child Process send incorrectly serializes values
|
child_process
|
# Code
index.js
```
const childProcess = require('child_process').fork(__dirname + '/worker.js');
childProcess.send('get-infinity');
childProcess.on('message', (valuePair) => {
console.log(valuePair);
});
```
worker.js
```
process.on('message', () => {
process.send(['Infinity', Infinity]);
process.send(['NaN', NaN]);
});
```
## Expected Result
```
['Infinity', Infinity]
['NaN', NaN]
```
## Actual Result
```
['Infinity', null]
['NaN', null]
```
* **Version**: _output of `node -v`_
v5.6.0
* **Platform**: _either `uname -a` output, or if Windows, version and 32-bit or 64-bit_
Linux tido-tchaikovsky 3.19.0-51-generic #57~14.04.1-Ubuntu SMP Fri Feb 19 14:36:55 UTC 2016 x86_64 x86_64 x86_64 GNU/Linux
* **Subsystem**: _optional. if known - please specify affected core module name_
child_process
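A likely explanation (assuming the default JSON-based IPC serialization between parent and child): JSON has no representation for `Infinity` or `NaN`, so `JSON.stringify` maps them to `null`, which is exactly what the parent receives. A minimal sketch of the round-trip:

```javascript
// JSON round-trip, as used by the child_process IPC channel,
// silently turns non-finite numbers into null.
const sent = ['Infinity', Infinity];
const received = JSON.parse(JSON.stringify(sent));
console.log(received);                     // [ 'Infinity', null ]
console.log(JSON.stringify(['NaN', NaN])); // ["NaN",null]
```

So the values are lost at serialization time, before the message ever reaches the other process.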
|
1.0
|
Child Process send incorrectly serializes values - # Code
index.js
```
const childProcess = require('child_process').fork(__dirname + '/worker.js');
childProcess.send('get-infinity');
childProcess.on('message', (valuePair) => {
console.log(valuePair);
});
```
worker.js
```
process.on('message', () => {
process.send(['Infinity', Infinity]);
process.send(['NaN', NaN]);
});
```
## Expected Result
```
['Infinity', Infinity]
['NaN', NaN]
```
## Actual Result
```
['Infinity', null]
['NaN', null]
```
* **Version**: _output of `node -v`_
v5.6.0
* **Platform**: _either `uname -a` output, or if Windows, version and 32-bit or 64-bit_
Linux tido-tchaikovsky 3.19.0-51-generic #57~14.04.1-Ubuntu SMP Fri Feb 19 14:36:55 UTC 2016 x86_64 x86_64 x86_64 GNU/Linux
* **Subsystem**: _optional. if known - please specify affected core module name_
child_process
|
process
|
child process send incorrectly serializes values code index js const childprocess require child process fork dirname worker js childprocess send get infinity childprocess on message valuepair console log valuepair worker js process on message process send process send expected result actual result version output of node v platform either uname a output or if windows version and bit or bit linux tido tchaikovsky generic ubuntu smp fri feb utc gnu linux subsystem optional if known please specify affected core module name child process
| 1
|
20,424
| 3,354,754,226
|
IssuesEvent
|
2015-11-18 13:49:46
|
car2go/openAPI
|
https://api.github.com/repos/car2go/openAPI
|
closed
|
Get bookings or create bookings returns NULL with test=1
|
auto-migrated Priority-Medium Type-Defect
|
```
Steps to reproduce:
1. Get a valid access_token
2. Perform a GET request call
https://www.car2go.com/api/v2.1/bookings?format=json&loc=Hamburg&oaut....&test=1
Expected output would be at least an operation successful 0 with an empty array
of bookings, that I can see without the test=1. With test=1 I don't get any
answer, it is just NULL. If I use an invalid location or omit the loc parameter,
I get an "Location invalid" with test=1.
The same also occurs when I try to create a booking. I can get the two test
accounts and when I try to create a booking with a valid VIN and the accountId,
the result is just empty.
```
Original issue reported on code.google.com by `piffp...@knacken.net` on 13 May 2012 at 8:46
|
1.0
|
Get bookings or create bookings returns NULL with test=1 - ```
Steps to reproduce:
1. Get a valid access_token
2. Perform a GET request call
https://www.car2go.com/api/v2.1/bookings?format=json&loc=Hamburg&oaut....&test=1
Expected output would be at least an operation successful 0 with an empty array
of bookings, that I can see without the test=1. With test=1 I don't get any
answer, it is just NULL. If I use an invalid location or omit the loc parameter,
I get an "Location invalid" with test=1.
The same also occurs when I try to create a booking. I can get the two test
accounts and when I try to create a booking with a valid VIN and the accountId,
the result is just empty.
```
Original issue reported on code.google.com by `piffp...@knacken.net` on 13 May 2012 at 8:46
|
non_process
|
get bookings or create bookings returns null with test stepts to reproduce get a valid access token perform a get request call expected output would be at least an operation succesfull with an empty array of bookings that i can see without the test with test i don t get any answer it is just null if i use an invalid loation or omit the loc parameter i get an location invalid with test the same also occurs when i try to create a booking i can get the two test accounts and when i try to create a booking with a valid vin and the accountid the result is just empty original issue reported on code google com by piffp knacken net on may at
| 0
|
15,090
| 18,798,726,530
|
IssuesEvent
|
2021-11-09 03:13:32
|
qgis/QGIS
|
https://api.github.com/repos/qgis/QGIS
|
closed
|
Graphic Modeler Export to Python for Matrix Parameter Giving Incorrect Syntax
|
Processing Bug Modeller
|
### What is the bug or the crash?
If your graphic model has a Matrix Parameter and you export it to Python, you get incorrect code for the Matrix Parameter. See the screenshot below

On export to python, it is giving this.
```python
self.addParameter(QgsProcessingParameterMatrix('outputs', 'Outputs', numberRows=, hasFixedNumberRows=, headers=['Name','Output? [Y/N]'], defaultValue=[Runoff,Y,Lead,N,Nitrogen,N,Phosphorus,N,Zinc,N,TSS,N]))
```
If you run the code with the above line it will give you an error. The fix that worked for me is given below
```python
self.addParameter(QgsProcessingParameterMatrix('outputs', 'Outputs', headers=['Name','Output? [Y/N]'], defaultValue=['Runoff','Y','Lead','N','Nitrogen','N','Phosphorus','N','Zinc','N','TSS','N']))
```
### Steps to reproduce the issue
Create a graphic model with matrix input.
Export model to python.
Run python script.
### Versions
Checked on 3.16 and 3.22.
### Supported QGIS version
- [X] I'm running a supported QGIS version according to the roadmap.
### New profile
- [ ] I tried with a new QGIS profile
### Additional context
_No response_
|
1.0
|
Graphic Modeler Export to Python for Matrix Parameter Giving Incorrect Syntax - ### What is the bug or the crash?
If your graphic model has a Matrix Parameter and you export it to Python, you get incorrect code for the Matrix Parameter. See the screenshot below

On export to python, it is giving this.
```python
self.addParameter(QgsProcessingParameterMatrix('outputs', 'Outputs', numberRows=, hasFixedNumberRows=, headers=['Name','Output? [Y/N]'], defaultValue=[Runoff,Y,Lead,N,Nitrogen,N,Phosphorus,N,Zinc,N,TSS,N]))
```
If you run the code with the above line it will give you an error. The fix that worked for me is given below
```python
self.addParameter(QgsProcessingParameterMatrix('outputs', 'Outputs', headers=['Name','Output? [Y/N]'], defaultValue=['Runoff','Y','Lead','N','Nitrogen','N','Phosphorus','N','Zinc','N','TSS','N']))
```
### Steps to reproduce the issue
Create a graphic model with matrix input.
Export model to python.
Run python script.
### Versions
Checked on 3.16 and 3.22.
### Supported QGIS version
- [X] I'm running a supported QGIS version according to the roadmap.
### New profile
- [ ] I tried with a new QGIS profile
### Additional context
_No response_
|
process
|
graphic modeler export to python for matrix parameter giving incorrect syntax what is the bug or the crash if your graphic model has a matrix parameter and you export it to python you get incorrect code for matrix parameter see screenshot below on export to python it is giving this python self addparameter qgsprocessingparametermatrix outputs outputs numberrows hasfixednumberrows headers defaultvalue if you run the code with the above line it will give you an error the fix that worked for me is given below python self addparameter qgsprocessingparametermatrix outputs outputs headers defaultvalue steps to reproduce the issue create a graphic model with matrix input export model to python run python script versions checked on and supported qgis version i m running a supported qgis version according to the roadmap new profile i tried with a new qgis profile additional context no response
| 1
|
20,055
| 26,542,477,565
|
IssuesEvent
|
2023-01-19 20:30:09
|
spinalcordtoolbox/spinalcordtoolbox
|
https://api.github.com/repos/spinalcordtoolbox/spinalcordtoolbox
|
closed
|
Add a method to compute the spinal cord volume
|
sct_process_segmentation priority:LOW feature good first issue user requested
|
Several users are asking for a feature to compute spinal cord volume. We could easily implement that in `sct_process_segmentation`.
User posts:
- http://forum.spinalcordmri.org/t/t1-contrast-brain-image-error-output/395/4
- http://forum.spinalcordmri.org/t/get-spinal-cord-volume/152
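The computation itself is simple: count the voxels in the binary segmentation and multiply by the volume of one voxel taken from the image header. A minimal sketch, with the understanding that the function name and inputs are illustrative and not the eventual `sct_process_segmentation` API:

```python
def cord_volume_mm3(mask, voxel_dims_mm):
    """Volume of a binary segmentation, in mm^3.

    mask          : nested 3-D list (z, y, x), truthy where the cord is segmented
    voxel_dims_mm : (dx, dy, dz) voxel size, e.g. read from the NIfTI header
    """
    n_voxels = sum(1 for plane in mask for row in plane for v in row if v)
    dx, dy, dz = voxel_dims_mm
    return n_voxels * dx * dy * dz

# a 2x2x2 block of segmented voxels at 0.5 mm isotropic resolution
mask = [[[1, 1], [1, 1]], [[1, 1], [1, 1]]]
print(cord_volume_mm3(mask, (0.5, 0.5, 0.5)))  # 1.0 mm^3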
|
1.0
|
Add a method to compute the spinal cord volume - Several users are asking for a feature to compute spinal cord volume. We could easily implement that in `sct_process_segmentation`.
User posts:
- http://forum.spinalcordmri.org/t/t1-contrast-brain-image-error-output/395/4
- http://forum.spinalcordmri.org/t/get-spinal-cord-volume/152
|
process
|
add a method to compute the spinal cord volume several users are asking for a feature to compute spinal cord volume we could easily implement that in sct process segmentation user posts
| 1
|
6,347
| 9,392,731,588
|
IssuesEvent
|
2019-04-07 04:15:54
|
P0cL4bs/WiFi-Pumpkin
|
https://api.github.com/repos/P0cL4bs/WiFi-Pumpkin
|
closed
|
captive portal with wifi-pumpkin
|
Feature request in process new version
|
## What's the problem (or question)?
I know this is quite a large question (or multiple) but I would really appreciate if somebody could take the time and help me with them:
I want to make a wireless access point with wifi-pumpkin, that, in the beginning, redirects the user to a startup page (captive portal) which is running on a local server. The captive portal forces the user to enter their email address and to accept the terms of use.
I found two good tutorials on the web
(https://projectzme.wordpress.com/2012/03/19/captive-portal-using-php-and-iptables-firewall-on-linux/ and
http://www.andybev.com/index.php/Using_iptables_and_PHP_to_create_a_captive_portal (both are very similar)) on how to do this but I ran into a bunch of problems.
The idea is fairly simple: i want to redirect every unknown users http traffic to my local server.
when the user agrees to my terms of use and entered his email, his traffic won't be redirected anymore...
This is what I have accomplished so far:
I started a local webserver with the 'phishing manager' from wifi-pumpkin
than i added following iptables to wifi-pumpkin:
- `iptables -t mangle -N internet`
- `iptables -t mangle -A PREROUTING -i wlan0 -p tcp -m tcp --dport 80 -j internet`
- `iptables -t mangle -A internet -j MARK --set-mark 99`
- `iptables -t nat -A PREROUTING -i wlan0 -p tcp -m mark --mark 99 -m tcp --dport 80 -j DNAT --to-destination 192.168.3.1`
then i started the ap with wifi-pumpkin
now, every user gets redirected to my captive portal (i tested it and it works).
So... here are (finally) my questions:
1. How is the correlation between the 'normal' iptables of my operating system and those of wifi-pumpkin? I don't know if this is a silly question but i am asking because when i edit those of my os, the iptables from wifi-pumpkin don't change, so i need to change them in the wifi-pumpkin settings.
2. Every user that has submitted their email and agreed to the terms of use should not be redirected to my local server. This means that I somehow need to remove _mark 99_ (which everyone gets when connecting to my ap (because of my iptables from above)) for this user.
The tutorial makes this with the following commands (after they get the mac address of the user)
`sudo iptables -D internet -t mangle -m mac --mac-source USER_MAC_ADDRESS -j RETURN`
`sudo rmtrack USER_IP_ADDRESS`
I don't really know how to implement the capturing of the MAC address, and also how to implement the two commands from above in wifi-pumpkin.
I would really appreciate if somebody could take the time and help me with my problem;)
#### Please tell us details about your environment.
* Card wireless adapters name (please check if support AP/mode): CSL 300Mbit Wlan Adatpter (supports AP-mode)
* Version used tool: v0.8.5
* Virtual Machine (yes or no and which): Kali Linux v2018.1 running on vmware
* Operating System and version: Linux Mint v18.3
|
1.0
|
captive portal with wifi-pumpkin - ## What's the problem (or question)?
I know this is quite a large question (or multiple) but I would really appreciate if somebody could take the time and help me with them:
I want to make a wireless access point with wifi-pumpkin, that, in the beginning, redirects the user to a startup page (captive portal) which is running on a local server. The captive portal forces the user to enter their email address and to accept the terms of use.
I found two good tutorials on the web
(https://projectzme.wordpress.com/2012/03/19/captive-portal-using-php-and-iptables-firewall-on-linux/ and
http://www.andybev.com/index.php/Using_iptables_and_PHP_to_create_a_captive_portal (both are very similar)) on how to do this but I ran into a bunch of problems.
The idea is fairly simple: i want to redirect every unknown users http traffic to my local server.
when the user agrees to my terms of use and entered his email, his traffic won't be redirected anymore...
This is what I have accomplished so far:
I started a local webserver with the 'phishing manager' from wifi-pumpkin
than i added following iptables to wifi-pumpkin:
- `iptables -t mangle -N internet`
- `iptables -t mangle -A PREROUTING -i wlan0 -p tcp -m tcp --dport 80 -j internet`
- `iptables -t mangle -A internet -j MARK --set-mark 99`
- `iptables -t nat -A PREROUTING -i wlan0 -p tcp -m mark --mark 99 -m tcp --dport 80 -j DNAT --to-destination 192.168.3.1`
then i started the ap with wifi-pumpkin
now, every user gets redirected to my captive portal (i tested it and it works).
So... here are (finally) my questions:
1. How is the correlation between the 'normal' iptables of my operating system and those of wifi-pumpkin? I don't know if this is a silly question but i am asking because when i edit those of my os, the iptables from wifi-pumpkin don't change, so i need to change them in the wifi-pumpkin settings.
2. Every user that has submitted their email and agreed to the terms of use should not be redirected to my local server. This means that I somehow need to remove _mark 99_ (which everyone gets when connecting to my ap (because of my iptables from above)) for this user.
The tutorial makes this with the following commands (after they get the mac address of the user)
`sudo iptables -D internet -t mangle -m mac --mac-source USER_MAC_ADDRESS -j RETURN`
`sudo rmtrack USER_IP_ADDRESS`
I don't really know how to implement the capturing of the MAC address, and also how to implement the two commands from above in wifi-pumpkin.
I would really appreciate if somebody could take the time and help me with my problem;)
#### Please tell us details about your environment.
* Card wireless adapters name (please check if support AP/mode): CSL 300Mbit Wlan Adatpter (supports AP-mode)
* Version used tool: v0.8.5
* Virtual Machine (yes or no and which): Kali Linux v2018.1 running on vmware
* Operating System and version: Linux Mint v18.3
|
process
|
captive portal with wifi pumpkin what s the problem or question i know this is quite a large question or multiple but i would really appreciate if somebody could take the time and help me with them i want to make a wireless access point with wifi pumpkin that in the beginning redirects the user to a startup page captive portal which is running on a local server the captive portal forces the user to enter their email address and to accept the terms of use i found two good tutorials on the web and both are very simmilar on how to do this but i ran into a bunch of problems the idea is fairly simple i want to redirect every unknown users http traffic to my local server when the user agrees to my terms of use and entered his email his traffic won t be redirected anymore this is what i have accomplished so far i started a local webserver with the phishing manager from wifi pumpkin than i added following iptables to wifi pumpkin iptables t mangle n internet iptables t mangle a prerouting i p tcp m tcp dport j internet iptables t mangle a internet j mark set mark iptables t nat a prerouting i p tcp m mark mark m tcp dport j dnat to destination then i started the ap with wifi pumpkin now every user gets redirected to the my captive portal i tested it and it works so here are finally my questions how is the correlation between the normal iptables of my operating system and those of wifi pumpkin i don t know if this is a silly question but i am asking because when i edit those of my os the iptables from wifi pumpkin don t change so i need to change them in the wifi pumpkin settings every user that that has submitted their email and agreed to the terms of use should not be redirected to my local server this means that i somehow need to remove mark which everyone gets when connecting to my ap because of my iptables from above for this user the tutorial makes this with the following comands after they get the mac address of the user sudo iptables d internet t mangle m mac mac 
source user mac address j return sudo rmtrack user ip address i don t really know how to implement to capturing of the mac address and also how to implement the two commands from above in wifi pumpkin i would really appreciate if somebody could take the time and help me with my problem please tell us details about your environment card wireless adapters name please check if support ap mode csl wlan adatpter supports ap mode version used tool virtual machine yes or no and which kali linux running on vmware operating system and version linux mint
| 1
|
1,048
| 3,518,229,953
|
IssuesEvent
|
2016-01-12 11:48:29
|
nodejs/node
|
https://api.github.com/repos/nodejs/node
|
closed
|
Unicode Issue in `child_process.execSync`
|
child_process
|
In `execSync`, if the input contains huge amounts of Unicode characters, the output will probably contain garbled characters. This bug exists in Node.js v5.3.0.
Steps to reproduce:
First, create an "abc.coffee" file with the following content:
```coffee
a = """
啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊
啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊
啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊
啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊
啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊
啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊
啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊
啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊
啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊
啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊
啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊
啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊
啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊
啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊
啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊
啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊
啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊
啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊
啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊
啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊
啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊
啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊
啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊
啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊
啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊
啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊
啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊
啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊
啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊
啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊
啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊
啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊
啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊
啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊
啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊
啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊
啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊
啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊
啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊
啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊
啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊
啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊
啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊
啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊
啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊
啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊
啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊
啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊
啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊
啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊
啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊
啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊
啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊
啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊
啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊
啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊
啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊
啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊
啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊
啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊
啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊
啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊
啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊
啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊
啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊
啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊
啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊
啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊
啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊
啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊
啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊
啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊
啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊
啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊
啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊
啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊
啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊
啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊
啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊
啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊
啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊
啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊
啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊
啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊
啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊
啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊
啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊
啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊
啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊
啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊
啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊
啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊
啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊
啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊
啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊
啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊
啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊
啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊
啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊
啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊
啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊
啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊
啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊
啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊
啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊
啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊
啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊
啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊
啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊
啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊
啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊
啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊
啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊
啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊
啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊
"""
```
Then, in Node.js console, type the following:
```javascript
input=fs.readFileSync("abc.coffee",{encoding:"utf8"})
```
And:
```javascript
child_process.execSync("coffee -bcs",{encoding:"utf8",input:input})
```
You can find that the return value contains some garbled characters "���" (though it's very rare). I think this problem isn't in CoffeeScript because if you use:
```bash
cat abc.coffee | coffee -bcs
```
Then the output is OK.
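The garbling is consistent with the child's output being read in fixed-size byte chunks and each chunk being decoded independently: '啊' is three bytes in UTF-8, so a chunk boundary that falls inside a character leaves an incomplete sequence at the edge, which decodes to U+FFFD. A self-contained illustration of the mechanism in Python (this shows the failure mode, not Node's actual internals):

```python
data = ("啊" * 4).encode("utf-8")  # 12 bytes, 3 bytes per character

# A read boundary at byte 10 falls inside the 4th character.
chunk = data[:10]
text = chunk.decode("utf-8", errors="replace")
print(text)  # 啊啊啊� — the trailing partial sequence becomes U+FFFD

# Decoding the whole buffer at once is fine:
print(data.decode("utf-8"))  # 啊啊啊啊
```

Piping through `cat` avoids the problem because the shell never decodes the bytes at all, which matches the observation that `cat abc.coffee | coffee -bcs` produces clean output.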
|
1.0
|
Unicode Issue in `child_process.execSync` - In `execSync`, if the input contains huge amounts of Unicode characters, the output will probably contain garbled characters. This bug exists in Node.js v5.3.0.
Steps to reproduce:
First, create an "abc.coffee" file with the following content:
```coffee
a = """
啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊
啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊
啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊
啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊
啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊
啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊
啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊
啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊
啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊
啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊
啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊
啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊
啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊
啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊
啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊
啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊
啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊
啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊
啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊
啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊
啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊
啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊
啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊
啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊
啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊
啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊
啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊
啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊
啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊
啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊
啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊
啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊
啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊
啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊
啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊
啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊
啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊
啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊
啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊
啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊
啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊
啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊
啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊
啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊
啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊
啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊
啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊
啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊
啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊
啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊
啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊
啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊
啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊
啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊
啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊
啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊
啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊
啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊
啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊
啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊
啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊
啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊
啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊
啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊
啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊
啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊
啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊
啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊
啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊
啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊
啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊
啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊
啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊
啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊
啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊
啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊
啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊
啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊
啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊
啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊
啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊
啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊
啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊
啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊
啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊
啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊
啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊
啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊
啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊
啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊
啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊
啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊
啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊
啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊
啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊
啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊
啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊
啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊
啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊
啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊
啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊
啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊
啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊
啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊
啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊
啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊
啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊
啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊
啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊
啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊
啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊
啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊
啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊
啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊
啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊
"""
```
Then, in Node.js console, type the following:
```javascript
input=fs.readFileSync("abc.coffee",{encoding:"utf8"})
```
And:
```javascript
child_process.execSync("coffee -bcs",{encoding:"utf8",input:input})
```
You can find that the return value contains some garbled characters "���" (though it's very rare). I think this problem isn't in CoffeeScript because if you use:
```bash
cat abc.coffee | coffee -bcs
```
Then the output is OK.
|
process
|
unicode issue in child process execsync in execsync if the input contains huge amounts of unicode characters the output will probably contain garbled characters this bug exists in node js steps to reproduce first create an abc coffee file with the following content coffee a 啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊 啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊 啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊 啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊 啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊 啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊 啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊 啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊 啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊 啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊 啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊 啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊 啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊 啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊 啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊 啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊 啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊 啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊 啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊 啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊 啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊 啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊 啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊 啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊 啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊 啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊 啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊 
啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊 啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊 啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊 啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊 啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊 啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊 啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊 啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊 啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊 啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊 啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊 啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊 啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊 啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊 啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊 啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊 啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊 啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊 啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊 啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊 啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊 啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊 啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊 啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊 啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊 啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊 啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊 啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊 啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊 啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊 啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊 啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊 啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊 啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊 啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊 啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊 啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊 
啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊 啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊 啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊 啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊 啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊 啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊 啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊 啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊 啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊 啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊 啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊 啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊 啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊 啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊 啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊 啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊 啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊 啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊 啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊 啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊 啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊 啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊 啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊 啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊 啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊 啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊 啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊 啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊 啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊 啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊 啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊 啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊 啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊 啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊 啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊 啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊 啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊 
啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊 啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊 啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊 啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊 啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊 啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊 啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊 啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊 啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊 啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊 啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊 啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊 啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊 啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊啊 then in node js console type the following javascript input fs readfilesync abc coffee encoding and javascript child process execsync coffee bcs encoding input input you can find the return value contains some garbled characters ��� though it s very rare i think this problem isn t in coffeescript because if you use bash cat abc coffee coffee bcs then the output is ok
| 1
|
19,457
| 25,744,659,631
|
IssuesEvent
|
2022-12-08 09:03:36
|
googleapis/google-cloud-php-resource-manager
|
https://api.github.com/repos/googleapis/google-cloud-php-resource-manager
|
closed
|
Your .repo-metadata.json file has a problem 🤒
|
type: process api: cloudresourcemanager repo-metadata: lint
|
You have a problem with your .repo-metadata.json file:
Result of scan 📈:
* client_documentation must match pattern "^https://.*" in .repo-metadata.json
* release_level must be equal to one of the allowed values in .repo-metadata.json
* api_shortname field missing from .repo-metadata.json
☝️ Once you address these problems, you can close this issue.
### Need help?
* [Schema definition](https://github.com/googleapis/repo-automation-bots/blob/main/packages/repo-metadata-lint/src/repo-metadata-schema.json): lists valid options for each field.
* [API index](https://github.com/googleapis/googleapis/blob/master/api-index-v1.json): for gRPC libraries **api_shortname** should match the subdomain of an API's **hostName**.
* Reach out to **go/github-automation** if you have any questions.
|
1.0
|
Your .repo-metadata.json file has a problem 🤒 - You have a problem with your .repo-metadata.json file:
Result of scan 📈:
* client_documentation must match pattern "^https://.*" in .repo-metadata.json
* release_level must be equal to one of the allowed values in .repo-metadata.json
* api_shortname field missing from .repo-metadata.json
☝️ Once you address these problems, you can close this issue.
### Need help?
* [Schema definition](https://github.com/googleapis/repo-automation-bots/blob/main/packages/repo-metadata-lint/src/repo-metadata-schema.json): lists valid options for each field.
* [API index](https://github.com/googleapis/googleapis/blob/master/api-index-v1.json): for gRPC libraries **api_shortname** should match the subdomain of an API's **hostName**.
* Reach out to **go/github-automation** if you have any questions.
|
process
|
your repo metadata json file has a problem 🤒 you have a problem with your repo metadata json file result of scan 📈 client documentation must match pattern in repo metadata json release level must be equal to one of the allowed values in repo metadata json api shortname field missing from repo metadata json ☝️ once you address these problems you can close this issue need help lists valid options for each field for grpc libraries api shortname should match the subdomain of an api s hostname reach out to go github automation if you have any questions
| 1
|
13,616
| 16,195,547,697
|
IssuesEvent
|
2021-05-04 14:10:44
|
GoogleCloudPlatform/fda-mystudies
|
https://api.github.com/repos/GoogleCloudPlatform/fda-mystudies
|
closed
|
[Response Server] iOS Active tasks > Tower of Hanoi and Spatial span memory > Response for 'Value' field is displayed as an array post submitting response
|
Bug P1 Process: Fixed Process: Tested QA Response datastore
|
A/R: Value parameter is shown as an array in response server
Eg. for Tower of Hanoi active task
`{"resultType":"boolean","startTime":"2020-11-06T20:51:50.750+0530","key":"puzzleWasSolved","endTime":"2020-11-06T20:52:18.119+0530","value":true,"skipped":false},{"skipped":false,"resultType":"numeric","value":15,"key":"numberOfMoves","endTime":"2020-11-06T20:52:18.119+0530","startTime":"2020-11-06T20:51:50.750+0530"}`
Expected: Value parameter should show appropriate data as per active tasks requirement

|
2.0
|
[Response Server] iOS Active tasks > Tower of Hanoi and Spatial span memory > Response for 'Value' field is displayed as an array post submitting response - A/R: Value parameter is shown as an array in response server
Eg. for Tower of Hanoi active task
`{"resultType":"boolean","startTime":"2020-11-06T20:51:50.750+0530","key":"puzzleWasSolved","endTime":"2020-11-06T20:52:18.119+0530","value":true,"skipped":false},{"skipped":false,"resultType":"numeric","value":15,"key":"numberOfMoves","endTime":"2020-11-06T20:52:18.119+0530","startTime":"2020-11-06T20:51:50.750+0530"}`
Expected: Value parameter should show appropriate data as per active tasks requirement

|
process
|
ios active tasks tower of hanoi and spatial span memory response for value field is displayed as an array post submitting response a r value parameter is shown as an array in response server eg for tower of hanoi active task resulttype boolean starttime key puzzlewassolved endtime value true skipped false skipped false resulttype numeric value key numberofmoves endtime starttime expected value parameter should show appropriate data as per active tasks requirement
| 1
|
504,625
| 14,620,373,091
|
IssuesEvent
|
2020-12-22 19:36:05
|
googleapis/elixir-google-api
|
https://api.github.com/repos/googleapis/elixir-google-api
|
closed
|
Synthesis failed for DigitalAssetLinks
|
autosynth failure priority: p1 type: bug
|
Hello! Autosynth couldn't regenerate DigitalAssetLinks. :broken_heart:
Here's the output from running `synth.py`:
```
2020-12-18 05:52:10,188 autosynth [INFO] > logs will be written to: /tmpfs/src/logs/elixir-google-api
2020-12-18 05:52:11,112 autosynth [DEBUG] > Running: git config --global core.excludesfile /home/kbuilder/.autosynth-gitignore
2020-12-18 05:52:11,115 autosynth [DEBUG] > Running: git config user.name yoshi-automation
2020-12-18 05:52:11,123 autosynth [DEBUG] > Running: git config user.email yoshi-automation@google.com
2020-12-18 05:52:11,126 autosynth [DEBUG] > Running: git config push.default simple
2020-12-18 05:52:11,129 autosynth [DEBUG] > Running: git branch -f autosynth-digitalassetlinks
2020-12-18 05:52:11,133 autosynth [DEBUG] > Running: git checkout autosynth-digitalassetlinks
Switched to branch 'autosynth-digitalassetlinks'
2020-12-18 05:52:11,383 autosynth [INFO] > Running synthtool
2020-12-18 05:52:11,383 autosynth [INFO] > ['/tmpfs/src/github/synthtool/env/bin/python3', '-m', 'synthtool', '--metadata', 'clients/digital_asset_links/synth.metadata', 'synth.py', '--']
2020-12-18 05:52:11,383 autosynth [DEBUG] > log_file_path: /tmpfs/src/logs/elixir-google-api/DigitalAssetLinks/sponge_log.log
2020-12-18 05:52:11,386 autosynth [DEBUG] > Running: /tmpfs/src/github/synthtool/env/bin/python3 -m synthtool --metadata clients/digital_asset_links/synth.metadata synth.py -- DigitalAssetLinks
2020-12-18 05:52:11,604 synthtool [DEBUG] > Executing /home/kbuilder/.cache/synthtool/elixir-google-api/synth.py.
On branch autosynth-digitalassetlinks
nothing to commit, working tree clean
2020-12-18 05:52:13,341 synthtool [DEBUG] > Running: docker run --rm -v/tmpfs/tmp/tmpf1q05nfi/repo:/workspace -v/var/run/docker.sock:/var/run/docker.sock -e USER_GROUP=1000:1000 -w /workspace gcr.io/cloud-devrel-public-resources/elixir19 scripts/generate_client.sh DigitalAssetLinks
DEBUG:synthtool:Running: docker run --rm -v/tmpfs/tmp/tmpf1q05nfi/repo:/workspace -v/var/run/docker.sock:/var/run/docker.sock -e USER_GROUP=1000:1000 -w /workspace gcr.io/cloud-devrel-public-resources/elixir19 scripts/generate_client.sh DigitalAssetLinks
/workspace /workspace
[33mThe mix.lock file was generated with a newer version of Hex. Update your client by running `mix local.hex` to avoid losing data.[0m
Resolving Hex dependencies...
Dependency resolution completed:
Unchanged:
certifi 2.5.1
google_api_discovery 0.7.0
google_gax 0.3.2
hackney 1.15.2
idna 6.0.0
jason 1.2.1
metrics 1.0.1
mime 1.3.1
mimerl 1.2.0
oauth2 0.9.4
parse_trans 3.3.0
poison 3.1.0
ssl_verify_fun 1.1.5
temp 0.4.7
tesla 1.3.3
unicode_util_compat 0.4.1
* Getting google_api_discovery (Hex package)
* Getting tesla (Hex package)
* Getting oauth2 (Hex package)
* Getting temp (Hex package)
* Getting jason (Hex package)
* Getting poison (Hex package)
* Getting hackney (Hex package)
* Getting certifi (Hex package)
* Getting idna (Hex package)
* Getting metrics (Hex package)
* Getting mimerl (Hex package)
* Getting ssl_verify_fun (Hex package)
* Getting unicode_util_compat (Hex package)
* Getting parse_trans (Hex package)
* Getting mime (Hex package)
* Getting google_gax (Hex package)
[33mThe mix.lock file was generated with a newer version of Hex. Update your client by running `mix local.hex` to avoid losing data.[0m
==> temp
Compiling 3 files (.ex)
Generated temp app
===> Compiling parse_trans
===> Compiling mimerl
===> Compiling metrics
===> Compiling unicode_util_compat
===> Compiling idna
==> jason
Compiling 8 files (.ex)
Generated jason app
warning: String.strip/1 is deprecated. Use String.trim/1 instead
/workspace/deps/poison/mix.exs:4
==> poison
Compiling 4 files (.ex)
warning: Integer.to_char_list/2 is deprecated. Use Integer.to_charlist/2 instead
lib/poison/encoder.ex:173
Generated poison app
==> ssl_verify_fun
Compiling 7 files (.erl)
Generated ssl_verify_fun app
===> Compiling certifi
===> Compiling hackney
==> oauth2
Compiling 13 files (.ex)
Generated oauth2 app
==> mime
Compiling 2 files (.ex)
Generated mime app
==> tesla
Compiling 26 files (.ex)
Generated tesla app
==> google_gax
Compiling 5 files (.ex)
Generated google_gax app
==> google_api_discovery
Compiling 21 files (.ex)
Generated google_api_discovery app
==> google_apis
Compiling 28 files (.ex)
Generated google_apis app
13:52:43.963 [info] FETCHING: https://digitalassetlinks.googleapis.com/$discovery/GOOGLE_REST_SIMPLE_URI?version=v1
13:52:44.120 [info] FETCHING: https://digitalassetlinks.googleapis.com/$discovery/rest?version=v1
13:52:44.519 [info] FOUND: https://digitalassetlinks.googleapis.com/$discovery/rest?version=v1
Revision check: old=20200829, new=20201207, generating=true
Creating leading directories
Writing AndroidAppAsset to clients/digital_asset_links/lib/google_api/digital_asset_links/v1/model/android_app_asset.ex.
Writing Asset to clients/digital_asset_links/lib/google_api/digital_asset_links/v1/model/asset.ex.
Writing CertificateInfo to clients/digital_asset_links/lib/google_api/digital_asset_links/v1/model/certificate_info.ex.
Writing CheckResponse to clients/digital_asset_links/lib/google_api/digital_asset_links/v1/model/check_response.ex.
Writing ListResponse to clients/digital_asset_links/lib/google_api/digital_asset_links/v1/model/list_response.ex.
Writing Statement to clients/digital_asset_links/lib/google_api/digital_asset_links/v1/model/statement.ex.
Writing WebAsset to clients/digital_asset_links/lib/google_api/digital_asset_links/v1/model/web_asset.ex.
Writing Assetlinks to clients/digital_asset_links/lib/google_api/digital_asset_links/v1/api/assetlinks.ex.
Writing Statements to clients/digital_asset_links/lib/google_api/digital_asset_links/v1/api/statements.ex.
Writing connection.ex.
Writing metadata.ex.
Writing mix.exs
Writing README.md
Writing LICENSE
Writing .gitignore
Writing config/config.exs
Writing test/test_helper.exs
13:52:44.942 [info] Found only discovery_revision and/or formatting changes. Not significant enough for a PR.
fixing file permissions
Traceback (most recent call last):
File "/home/kbuilder/.pyenv/versions/3.6.9/lib/python3.6/runpy.py", line 193, in _run_module_as_main
"__main__", mod_spec)
File "/home/kbuilder/.pyenv/versions/3.6.9/lib/python3.6/runpy.py", line 85, in _run_code
exec(code, run_globals)
File "/tmpfs/src/github/synthtool/synthtool/__main__.py", line 102, in <module>
main()
File "/tmpfs/src/github/synthtool/env/lib/python3.6/site-packages/click/core.py", line 829, in __call__
return self.main(*args, **kwargs)
File "/tmpfs/src/github/synthtool/env/lib/python3.6/site-packages/click/core.py", line 782, in main
rv = self.invoke(ctx)
File "/tmpfs/src/github/synthtool/env/lib/python3.6/site-packages/click/core.py", line 1066, in invoke
return ctx.invoke(self.callback, **ctx.params)
File "/tmpfs/src/github/synthtool/env/lib/python3.6/site-packages/click/core.py", line 610, in invoke
return callback(*args, **kwargs)
File "/tmpfs/src/github/synthtool/synthtool/__main__.py", line 94, in main
spec.loader.exec_module(synth_module) # type: ignore
File "/tmpfs/src/github/synthtool/synthtool/metadata.py", line 252, in __exit__
self.observer.stop()
File "/tmpfs/src/github/synthtool/env/lib/python3.6/site-packages/watchdog/utils/__init__.py", line 81, in stop
self.on_thread_stop()
File "/tmpfs/src/github/synthtool/env/lib/python3.6/site-packages/watchdog/observers/api.py", line 361, in on_thread_stop
self.unschedule_all()
File "/tmpfs/src/github/synthtool/env/lib/python3.6/site-packages/watchdog/observers/api.py", line 357, in unschedule_all
self._clear_emitters()
File "/tmpfs/src/github/synthtool/env/lib/python3.6/site-packages/watchdog/observers/api.py", line 231, in _clear_emitters
emitter.stop()
File "/tmpfs/src/github/synthtool/env/lib/python3.6/site-packages/watchdog/utils/__init__.py", line 81, in stop
self.on_thread_stop()
File "/tmpfs/src/github/synthtool/env/lib/python3.6/site-packages/watchdog/observers/inotify.py", line 121, in on_thread_stop
self._inotify.close()
File "/tmpfs/src/github/synthtool/env/lib/python3.6/site-packages/watchdog/observers/inotify_buffer.py", line 50, in close
self.stop()
File "/tmpfs/src/github/synthtool/env/lib/python3.6/site-packages/watchdog/utils/__init__.py", line 81, in stop
self.on_thread_stop()
File "/tmpfs/src/github/synthtool/env/lib/python3.6/site-packages/watchdog/observers/inotify_buffer.py", line 46, in on_thread_stop
self._inotify.close()
File "/tmpfs/src/github/synthtool/env/lib/python3.6/site-packages/watchdog/observers/inotify_c.py", line 277, in close
os.close(self._inotify_fd)
OSError: [Errno 9] Bad file descriptor
2020-12-18 05:52:47,952 autosynth [ERROR] > Synthesis failed
2020-12-18 05:52:47,952 autosynth [DEBUG] > Running: git clean -fdx
Removing __pycache__/
Traceback (most recent call last):
File "/home/kbuilder/.pyenv/versions/3.6.9/lib/python3.6/runpy.py", line 193, in _run_module_as_main
"__main__", mod_spec)
File "/home/kbuilder/.pyenv/versions/3.6.9/lib/python3.6/runpy.py", line 85, in _run_code
exec(code, run_globals)
File "/tmpfs/src/github/synthtool/autosynth/synth.py", line 354, in <module>
main()
File "/tmpfs/src/github/synthtool/autosynth/synth.py", line 189, in main
return _inner_main(temp_dir)
File "/tmpfs/src/github/synthtool/autosynth/synth.py", line 291, in _inner_main
).synthesize(synth_log_path / "sponge_log.log")
File "/tmpfs/src/github/synthtool/autosynth/synthesizer.py", line 120, in synthesize
synth_proc.check_returncode() # Raise an exception.
File "/home/kbuilder/.pyenv/versions/3.6.9/lib/python3.6/subprocess.py", line 389, in check_returncode
self.stderr)
subprocess.CalledProcessError: Command '['/tmpfs/src/github/synthtool/env/bin/python3', '-m', 'synthtool', '--metadata', 'clients/digital_asset_links/synth.metadata', 'synth.py', '--', 'DigitalAssetLinks']' returned non-zero exit status 1.
```
Google internal developers can see the full log [here](http://sponge2/results/invocations/3a455424-7540-46a9-bb77-8265c8f04c06/targets/github%2Fsynthtool;config=default/tests;query=elixir-google-api;failed=false).
|
1.0
|
Synthesis failed for DigitalAssetLinks - Hello! Autosynth couldn't regenerate DigitalAssetLinks. :broken_heart:
Here's the output from running `synth.py`:
```
2020-12-18 05:52:10,188 autosynth [INFO] > logs will be written to: /tmpfs/src/logs/elixir-google-api
2020-12-18 05:52:11,112 autosynth [DEBUG] > Running: git config --global core.excludesfile /home/kbuilder/.autosynth-gitignore
2020-12-18 05:52:11,115 autosynth [DEBUG] > Running: git config user.name yoshi-automation
2020-12-18 05:52:11,123 autosynth [DEBUG] > Running: git config user.email yoshi-automation@google.com
2020-12-18 05:52:11,126 autosynth [DEBUG] > Running: git config push.default simple
2020-12-18 05:52:11,129 autosynth [DEBUG] > Running: git branch -f autosynth-digitalassetlinks
2020-12-18 05:52:11,133 autosynth [DEBUG] > Running: git checkout autosynth-digitalassetlinks
Switched to branch 'autosynth-digitalassetlinks'
2020-12-18 05:52:11,383 autosynth [INFO] > Running synthtool
2020-12-18 05:52:11,383 autosynth [INFO] > ['/tmpfs/src/github/synthtool/env/bin/python3', '-m', 'synthtool', '--metadata', 'clients/digital_asset_links/synth.metadata', 'synth.py', '--']
2020-12-18 05:52:11,383 autosynth [DEBUG] > log_file_path: /tmpfs/src/logs/elixir-google-api/DigitalAssetLinks/sponge_log.log
2020-12-18 05:52:11,386 autosynth [DEBUG] > Running: /tmpfs/src/github/synthtool/env/bin/python3 -m synthtool --metadata clients/digital_asset_links/synth.metadata synth.py -- DigitalAssetLinks
2020-12-18 05:52:11,604 synthtool [DEBUG] > Executing /home/kbuilder/.cache/synthtool/elixir-google-api/synth.py.
On branch autosynth-digitalassetlinks
nothing to commit, working tree clean
2020-12-18 05:52:13,341 synthtool [DEBUG] > Running: docker run --rm -v/tmpfs/tmp/tmpf1q05nfi/repo:/workspace -v/var/run/docker.sock:/var/run/docker.sock -e USER_GROUP=1000:1000 -w /workspace gcr.io/cloud-devrel-public-resources/elixir19 scripts/generate_client.sh DigitalAssetLinks
DEBUG:synthtool:Running: docker run --rm -v/tmpfs/tmp/tmpf1q05nfi/repo:/workspace -v/var/run/docker.sock:/var/run/docker.sock -e USER_GROUP=1000:1000 -w /workspace gcr.io/cloud-devrel-public-resources/elixir19 scripts/generate_client.sh DigitalAssetLinks
/workspace /workspace
[33mThe mix.lock file was generated with a newer version of Hex. Update your client by running `mix local.hex` to avoid losing data.[0m
Resolving Hex dependencies...
Dependency resolution completed:
Unchanged:
certifi 2.5.1
google_api_discovery 0.7.0
google_gax 0.3.2
hackney 1.15.2
idna 6.0.0
jason 1.2.1
metrics 1.0.1
mime 1.3.1
mimerl 1.2.0
oauth2 0.9.4
parse_trans 3.3.0
poison 3.1.0
ssl_verify_fun 1.1.5
temp 0.4.7
tesla 1.3.3
unicode_util_compat 0.4.1
* Getting google_api_discovery (Hex package)
* Getting tesla (Hex package)
* Getting oauth2 (Hex package)
* Getting temp (Hex package)
* Getting jason (Hex package)
* Getting poison (Hex package)
* Getting hackney (Hex package)
* Getting certifi (Hex package)
* Getting idna (Hex package)
* Getting metrics (Hex package)
* Getting mimerl (Hex package)
* Getting ssl_verify_fun (Hex package)
* Getting unicode_util_compat (Hex package)
* Getting parse_trans (Hex package)
* Getting mime (Hex package)
* Getting google_gax (Hex package)
[33mThe mix.lock file was generated with a newer version of Hex. Update your client by running `mix local.hex` to avoid losing data.[0m
==> temp
Compiling 3 files (.ex)
Generated temp app
===> Compiling parse_trans
===> Compiling mimerl
===> Compiling metrics
===> Compiling unicode_util_compat
===> Compiling idna
==> jason
Compiling 8 files (.ex)
Generated jason app
warning: String.strip/1 is deprecated. Use String.trim/1 instead
/workspace/deps/poison/mix.exs:4
==> poison
Compiling 4 files (.ex)
warning: Integer.to_char_list/2 is deprecated. Use Integer.to_charlist/2 instead
lib/poison/encoder.ex:173
Generated poison app
==> ssl_verify_fun
Compiling 7 files (.erl)
Generated ssl_verify_fun app
===> Compiling certifi
===> Compiling hackney
==> oauth2
Compiling 13 files (.ex)
Generated oauth2 app
==> mime
Compiling 2 files (.ex)
Generated mime app
==> tesla
Compiling 26 files (.ex)
Generated tesla app
==> google_gax
Compiling 5 files (.ex)
Generated google_gax app
==> google_api_discovery
Compiling 21 files (.ex)
Generated google_api_discovery app
==> google_apis
Compiling 28 files (.ex)
Generated google_apis app
13:52:43.963 [info] FETCHING: https://digitalassetlinks.googleapis.com/$discovery/GOOGLE_REST_SIMPLE_URI?version=v1
13:52:44.120 [info] FETCHING: https://digitalassetlinks.googleapis.com/$discovery/rest?version=v1
13:52:44.519 [info] FOUND: https://digitalassetlinks.googleapis.com/$discovery/rest?version=v1
Revision check: old=20200829, new=20201207, generating=true
Creating leading directories
Writing AndroidAppAsset to clients/digital_asset_links/lib/google_api/digital_asset_links/v1/model/android_app_asset.ex.
Writing Asset to clients/digital_asset_links/lib/google_api/digital_asset_links/v1/model/asset.ex.
Writing CertificateInfo to clients/digital_asset_links/lib/google_api/digital_asset_links/v1/model/certificate_info.ex.
Writing CheckResponse to clients/digital_asset_links/lib/google_api/digital_asset_links/v1/model/check_response.ex.
Writing ListResponse to clients/digital_asset_links/lib/google_api/digital_asset_links/v1/model/list_response.ex.
Writing Statement to clients/digital_asset_links/lib/google_api/digital_asset_links/v1/model/statement.ex.
Writing WebAsset to clients/digital_asset_links/lib/google_api/digital_asset_links/v1/model/web_asset.ex.
Writing Assetlinks to clients/digital_asset_links/lib/google_api/digital_asset_links/v1/api/assetlinks.ex.
Writing Statements to clients/digital_asset_links/lib/google_api/digital_asset_links/v1/api/statements.ex.
Writing connection.ex.
Writing metadata.ex.
Writing mix.exs
Writing README.md
Writing LICENSE
Writing .gitignore
Writing config/config.exs
Writing test/test_helper.exs
13:52:44.942 [info] Found only discovery_revision and/or formatting changes. Not significant enough for a PR.
fixing file permissions
Traceback (most recent call last):
File "/home/kbuilder/.pyenv/versions/3.6.9/lib/python3.6/runpy.py", line 193, in _run_module_as_main
"__main__", mod_spec)
File "/home/kbuilder/.pyenv/versions/3.6.9/lib/python3.6/runpy.py", line 85, in _run_code
exec(code, run_globals)
File "/tmpfs/src/github/synthtool/synthtool/__main__.py", line 102, in <module>
main()
File "/tmpfs/src/github/synthtool/env/lib/python3.6/site-packages/click/core.py", line 829, in __call__
return self.main(*args, **kwargs)
File "/tmpfs/src/github/synthtool/env/lib/python3.6/site-packages/click/core.py", line 782, in main
rv = self.invoke(ctx)
File "/tmpfs/src/github/synthtool/env/lib/python3.6/site-packages/click/core.py", line 1066, in invoke
return ctx.invoke(self.callback, **ctx.params)
File "/tmpfs/src/github/synthtool/env/lib/python3.6/site-packages/click/core.py", line 610, in invoke
return callback(*args, **kwargs)
File "/tmpfs/src/github/synthtool/synthtool/__main__.py", line 94, in main
spec.loader.exec_module(synth_module) # type: ignore
File "/tmpfs/src/github/synthtool/synthtool/metadata.py", line 252, in __exit__
self.observer.stop()
File "/tmpfs/src/github/synthtool/env/lib/python3.6/site-packages/watchdog/utils/__init__.py", line 81, in stop
self.on_thread_stop()
File "/tmpfs/src/github/synthtool/env/lib/python3.6/site-packages/watchdog/observers/api.py", line 361, in on_thread_stop
self.unschedule_all()
File "/tmpfs/src/github/synthtool/env/lib/python3.6/site-packages/watchdog/observers/api.py", line 357, in unschedule_all
self._clear_emitters()
File "/tmpfs/src/github/synthtool/env/lib/python3.6/site-packages/watchdog/observers/api.py", line 231, in _clear_emitters
emitter.stop()
File "/tmpfs/src/github/synthtool/env/lib/python3.6/site-packages/watchdog/utils/__init__.py", line 81, in stop
self.on_thread_stop()
File "/tmpfs/src/github/synthtool/env/lib/python3.6/site-packages/watchdog/observers/inotify.py", line 121, in on_thread_stop
self._inotify.close()
File "/tmpfs/src/github/synthtool/env/lib/python3.6/site-packages/watchdog/observers/inotify_buffer.py", line 50, in close
self.stop()
File "/tmpfs/src/github/synthtool/env/lib/python3.6/site-packages/watchdog/utils/__init__.py", line 81, in stop
self.on_thread_stop()
File "/tmpfs/src/github/synthtool/env/lib/python3.6/site-packages/watchdog/observers/inotify_buffer.py", line 46, in on_thread_stop
self._inotify.close()
File "/tmpfs/src/github/synthtool/env/lib/python3.6/site-packages/watchdog/observers/inotify_c.py", line 277, in close
os.close(self._inotify_fd)
OSError: [Errno 9] Bad file descriptor
2020-12-18 05:52:47,952 autosynth [ERROR] > Synthesis failed
2020-12-18 05:52:47,952 autosynth [DEBUG] > Running: git clean -fdx
Removing __pycache__/
Traceback (most recent call last):
File "/home/kbuilder/.pyenv/versions/3.6.9/lib/python3.6/runpy.py", line 193, in _run_module_as_main
"__main__", mod_spec)
File "/home/kbuilder/.pyenv/versions/3.6.9/lib/python3.6/runpy.py", line 85, in _run_code
exec(code, run_globals)
File "/tmpfs/src/github/synthtool/autosynth/synth.py", line 354, in <module>
main()
File "/tmpfs/src/github/synthtool/autosynth/synth.py", line 189, in main
return _inner_main(temp_dir)
File "/tmpfs/src/github/synthtool/autosynth/synth.py", line 291, in _inner_main
).synthesize(synth_log_path / "sponge_log.log")
File "/tmpfs/src/github/synthtool/autosynth/synthesizer.py", line 120, in synthesize
synth_proc.check_returncode() # Raise an exception.
File "/home/kbuilder/.pyenv/versions/3.6.9/lib/python3.6/subprocess.py", line 389, in check_returncode
self.stderr)
subprocess.CalledProcessError: Command '['/tmpfs/src/github/synthtool/env/bin/python3', '-m', 'synthtool', '--metadata', 'clients/digital_asset_links/synth.metadata', 'synth.py', '--', 'DigitalAssetLinks']' returned non-zero exit status 1.
```
Google internal developers can see the full log [here](http://sponge2/results/invocations/3a455424-7540-46a9-bb77-8265c8f04c06/targets/github%2Fsynthtool;config=default/tests;query=elixir-google-api;failed=false).
|
non_process
|
synthesis failed for digitalassetlinks hello autosynth couldn t regenerate digitalassetlinks broken heart here s the output from running synth py autosynth logs will be written to tmpfs src logs elixir google api autosynth running git config global core excludesfile home kbuilder autosynth gitignore autosynth running git config user name yoshi automation autosynth running git config user email yoshi automation google com autosynth running git config push default simple autosynth running git branch f autosynth digitalassetlinks autosynth running git checkout autosynth digitalassetlinks switched to branch autosynth digitalassetlinks autosynth running synthtool autosynth autosynth log file path tmpfs src logs elixir google api digitalassetlinks sponge log log autosynth running tmpfs src github synthtool env bin m synthtool metadata clients digital asset links synth metadata synth py digitalassetlinks synthtool executing home kbuilder cache synthtool elixir google api synth py on branch autosynth digitalassetlinks nothing to commit working tree clean synthtool running docker run rm v tmpfs tmp repo workspace v var run docker sock var run docker sock e user group w workspace gcr io cloud devrel public resources scripts generate client sh digitalassetlinks debug synthtool running docker run rm v tmpfs tmp repo workspace v var run docker sock var run docker sock e user group w workspace gcr io cloud devrel public resources scripts generate client sh digitalassetlinks workspace workspace mix lock file was generated with a newer version of hex update your client by running mix local hex to avoid losing data resolving hex dependencies dependency resolution completed unchanged certifi google api discovery google gax hackney idna jason metrics mime mimerl parse trans poison ssl verify fun temp tesla unicode util compat getting google api discovery hex package getting tesla hex package getting hex package getting temp hex package getting jason hex package getting poison hex 
package getting hackney hex package getting certifi hex package getting idna hex package getting metrics hex package getting mimerl hex package getting ssl verify fun hex package getting unicode util compat hex package getting parse trans hex package getting mime hex package getting google gax hex package mix lock file was generated with a newer version of hex update your client by running mix local hex to avoid losing data temp compiling files ex generated temp app compiling parse trans compiling mimerl compiling metrics compiling unicode util compat compiling idna jason compiling files ex generated jason app warning string strip is deprecated use string trim instead workspace deps poison mix exs poison compiling files ex warning integer to char list is deprecated use integer to charlist instead lib poison encoder ex generated poison app ssl verify fun compiling files erl generated ssl verify fun app compiling certifi compiling hackney compiling files ex generated app mime compiling files ex generated mime app tesla compiling files ex generated tesla app google gax compiling files ex generated google gax app google api discovery compiling files ex generated google api discovery app google apis compiling files ex generated google apis app fetching fetching found revision check old new generating true creating leading directories writing androidappasset to clients digital asset links lib google api digital asset links model android app asset ex writing asset to clients digital asset links lib google api digital asset links model asset ex writing certificateinfo to clients digital asset links lib google api digital asset links model certificate info ex writing checkresponse to clients digital asset links lib google api digital asset links model check response ex writing listresponse to clients digital asset links lib google api digital asset links model list response ex writing statement to clients digital asset links lib google api digital asset links model 
statement ex writing webasset to clients digital asset links lib google api digital asset links model web asset ex writing assetlinks to clients digital asset links lib google api digital asset links api assetlinks ex writing statements to clients digital asset links lib google api digital asset links api statements ex writing connection ex writing metadata ex writing mix exs writing readme md writing license writing gitignore writing config config exs writing test test helper exs found only discovery revision and or formatting changes not significant enough for a pr fixing file permissions traceback most recent call last file home kbuilder pyenv versions lib runpy py line in run module as main main mod spec file home kbuilder pyenv versions lib runpy py line in run code exec code run globals file tmpfs src github synthtool synthtool main py line in main file tmpfs src github synthtool env lib site packages click core py line in call return self main args kwargs file tmpfs src github synthtool env lib site packages click core py line in main rv self invoke ctx file tmpfs src github synthtool env lib site packages click core py line in invoke return ctx invoke self callback ctx params file tmpfs src github synthtool env lib site packages click core py line in invoke return callback args kwargs file tmpfs src github synthtool synthtool main py line in main spec loader exec module synth module type ignore file tmpfs src github synthtool synthtool metadata py line in exit self observer stop file tmpfs src github synthtool env lib site packages watchdog utils init py line in stop self on thread stop file tmpfs src github synthtool env lib site packages watchdog observers api py line in on thread stop self unschedule all file tmpfs src github synthtool env lib site packages watchdog observers api py line in unschedule all self clear emitters file tmpfs src github synthtool env lib site packages watchdog observers api py line in clear emitters emitter stop file tmpfs src 
github synthtool env lib site packages watchdog utils init py line in stop self on thread stop file tmpfs src github synthtool env lib site packages watchdog observers inotify py line in on thread stop self inotify close file tmpfs src github synthtool env lib site packages watchdog observers inotify buffer py line in close self stop file tmpfs src github synthtool env lib site packages watchdog utils init py line in stop self on thread stop file tmpfs src github synthtool env lib site packages watchdog observers inotify buffer py line in on thread stop self inotify close file tmpfs src github synthtool env lib site packages watchdog observers inotify c py line in close os close self inotify fd oserror bad file descriptor autosynth synthesis failed autosynth running git clean fdx removing pycache traceback most recent call last file home kbuilder pyenv versions lib runpy py line in run module as main main mod spec file home kbuilder pyenv versions lib runpy py line in run code exec code run globals file tmpfs src github synthtool autosynth synth py line in main file tmpfs src github synthtool autosynth synth py line in main return inner main temp dir file tmpfs src github synthtool autosynth synth py line in inner main synthesize synth log path sponge log log file tmpfs src github synthtool autosynth synthesizer py line in synthesize synth proc check returncode raise an exception file home kbuilder pyenv versions lib subprocess py line in check returncode self stderr subprocess calledprocesserror command returned non zero exit status google internal developers can see the full log
| 0
|
739,857
| 25,724,866,212
|
IssuesEvent
|
2022-12-07 15:59:55
|
inverse-inc/packetfence
|
https://api.github.com/repos/inverse-inc/packetfence
|
closed
|
Switch without Switch Group?
|
Type: Bug Priority: Medium
|
**Describe the bug**
When you create a switch you need to choose a switch group
After that point, you can remove that switch group but the switch looks still linked to this group.
In the config the "group=" is not here.
**To Reproduce**
Steps to reproduce the behavior:
1. Create the switch
2. Remove the group
3. Show in switch group or in the tab switch, the switch is still linked to the group
**Expected behavior**
Group empty or remove the option to have a switch without group
|
1.0
|
Switch without Switch Group? - **Describe the bug**
When you create a switch you need to choose a switch group
After that point, you can remove that switch group but the switch looks still linked to this group.
In the config the "group=" is not here.
**To Reproduce**
Steps to reproduce the behavior:
1. Create the switch
2. Remove the group
3. Show in switch group or in the tab switch, the switch is still linked to the group
**Expected behavior**
Group empty or remove the option to have a switch without group
|
non_process
|
switch without switch group describe the bug when you create a switch you need to choose a switch group after that point you can remove that switch group but the switch looks still linked to this group in the config the group is not here to reproduce steps to reproduce the behavior create the switch remove the group show in switch group or in the tab switch the switch is still linked to the group expected behavior group empty or remove the option to have a switch without group
| 0
|
12,224
| 14,743,226,145
|
IssuesEvent
|
2021-01-07 13:35:50
|
kdjstudios/SABillingGitlab
|
https://api.github.com/repos/kdjstudios/SABillingGitlab
|
closed
|
Weekly Account Balance to Invoice Validation | parent: 1523
|
anc-process anp-0.5 ant-enhancement grt-server
|
In GitLab by @tim.traylor on Aug 5, 2019, 06:01
Please create a job that will run weekly and that will check the account balance vs invoice balances as was done for #1538.
This script should send a notification to criticalerrors@sahosted.com if any accounts are out of balance. The job itself should also send an error if it cannot complete and the OS should send an error if the cron job can't run.
|
1.0
|
Weekly Account Balance to Invoice Validation | parent: 1523 - In GitLab by @tim.traylor on Aug 5, 2019, 06:01
Please create a job that will run weekly and that will check the account balance vs invoice balances as was done for #1538.
This script should send a notification to criticalerrors@sahosted.com if any accounts are out of balance. The job itself should also send an error if it cannot complete and the OS should send an error if the cron job can't run.
|
process
|
weekly account balance to invoice validation parent in gitlab by tim traylor on aug please create a job that will run weekly and that will check the account balance vs invoice balances as was done for this script should send a notification to criticalerrors sahosted com if any accounts are out of balance the job itself should also send an error if it cannot complete and the os should send an error if the cron job can t run
| 1
|
559
| 3,020,809,952
|
IssuesEvent
|
2015-07-31 10:31:52
|
dita-ot/dita-ot
|
https://api.github.com/repos/dita-ot/dita-ot
|
opened
|
Defer reading listing images to end of preprocessing
|
feature P2 preprocess
|
In the current preprocessing design, list of images is generated during gen-list target. However, this relies on listing image formats in a configuration file and not relying on `<image>` element references, since key defined images are not processed at that point. An alternative design would be to create a list of images during image-metadata target where image dimensions are read. This would allow getting an accurate list of images that doesn't rely on `@format` or file extensions.
|
1.0
|
Defer reading listing images to end of preprocessing - In the current preprocessing design, list of images is generated during gen-list target. However, this relies on listing image formats in a configuration file and not relying on `<image>` element references, since key defined images are not processed at that point. An alternative design would be to create a list of images during image-metadata target where image dimensions are read. This would allow getting an accurate list of images that doesn't rely on `@format` or file extensions.
|
process
|
defer reading listing images to end of preprocessing in the current preprocessing design list of images is generated during gen list target however this relies on listing image formats in a configuration file and not relying on element references since key defined images are not processed at that point an alternative design would be to create a list of images during image metadata target where image dimensions are read this would allow getting an accurate list of images that doesn t rely on format or file extensions
| 1
|
11,459
| 14,284,085,755
|
IssuesEvent
|
2020-11-23 11:59:21
|
DevExpress/testcafe-hammerhead
|
https://api.github.com/repos/DevExpress/testcafe-hammerhead
|
closed
|
Adobe Launch Analytics requests are not fired when accessing a page with TestCafe
|
AREA: client FREQUENCY: level 1 SYSTEM: client side processing TYPE: bug
|
### What is your Scenario?
If accessing the test website with the TestCafe the Adobe Launch Analytics requests are not displayed.
If accessing the page manually in browser, Adobe Launch Analytics requests are displayed.
### Steps to Reproduce:
- For TestCafe - Run the attached code
- While the page is loading open the Developer Tools and filter the requests by **b/ss** and observe that there are no requests
- Manually: it cannot be reproduced.
### What is the Current behavior?
- When accessing the website the TestCafe no Adobe Launch Analytics requests are fired.
- Also one or more JavaScript errors are displayed in Console: ***Cannot read property 'indexOf' of undefined***
- Note that these JavaScript errors are not displayed when visiting the page manually.

### What is the Expected behavior?
- When visiting the page with TestCafe Adobe Launch Analytics can be fired:
- No JavaScript errors are displayed after accepting the cookies
### What is your public web site URL?
https://www.newegg.com/
<summary>Your complete test code (or attach your test files):</summary>
```
import {Selector} from "testcafe";
fixture`Analytics`
.beforeEach(async t => {
await t.maximizeWindow();
})
test('Adobe Launch Analytics', async t => {
await t.navigateTo("https://www.newegg.com/");
await t.click(Selector("#onetrust-pc-btn-handler"));
await t.click(Selector('div:nth-child(3) label span.ot-switch-nob').filterVisible());
await t.click(Selector('div:nth-child(4) label span.ot-switch-nob').filterVisible());
await t.click(Selector('div:nth-child(5) label span.ot-switch-nob').filterVisible());
await t.click('.save-preference-btn-handler.onetrust-close-btn-handler');
await t.debug();
})
```
</details>
<details>
<summary>Your complete configuration file (if any):</summary>
<!-- Paste your complete test config file here (even if it is huge): -->
```
{
"screenshotPath": "screenshots",
"browsers": "chrome",
"screenshots": {
"takeOnFails": true,
"fullPage": true
},
"debugOnFail": false,
"skipJsErrors": true,
"disablePageCaching": false,
"color": true,
"speed": 0.90,
"pageLoadTimeout": 30000,
"assertionTimeout": 30000,
"selectorTimeout": 30000,
"concurrency": 1
}
```
</details>
### Your Environment details:
* testcafe version: 1.9.4
* node.js version: v11.2.0
command-line arguments: no additional parameters
* platform and version: Chrome - Version 85.0.4183.121 (Official Build) (64-bit)
* other: win 10
|
1.0
|
Adobe Launch Analytics requests are not fired when accessing a page with TestCafe - ### What is your Scenario?
If accessing the test website with the TestCafe the Adobe Launch Analytics requests are not displayed.
If accessing the page manually in browser, Adobe Launch Analytics requests are displayed.
### Steps to Reproduce:
- For TestCafe - Run the attached code
- While the page is loading open the Developer Tools and filter the requests by **b/ss** and observe that there are no requests
- Manually: it cannot be reproduced.
### What is the Current behavior?
- When accessing the website the TestCafe no Adobe Launch Analytics requests are fired.
- Also one or more JavaScript errors are displayed in Console: ***Cannot read property 'indexOf' of undefined***
- Note that these JavaScript errors are not displayed when visiting the page manually.

### What is the Expected behavior?
- When visiting the page with TestCafe Adobe Launch Analytics can be fired:
- No JavaScript errors are displayed after accepting the cookies
### What is your public web site URL?
https://www.newegg.com/
<summary>Your complete test code (or attach your test files):</summary>
```
import {Selector} from "testcafe";
fixture`Analytics`
.beforeEach(async t => {
await t.maximizeWindow();
})
test('Adobe Launch Analytics', async t => {
await t.navigateTo("https://www.newegg.com/");
await t.click(Selector("#onetrust-pc-btn-handler"));
await t.click(Selector('div:nth-child(3) label span.ot-switch-nob').filterVisible());
await t.click(Selector('div:nth-child(4) label span.ot-switch-nob').filterVisible());
await t.click(Selector('div:nth-child(5) label span.ot-switch-nob').filterVisible());
await t.click('.save-preference-btn-handler.onetrust-close-btn-handler');
await t.debug();
})
```
</details>
<details>
<summary>Your complete configuration file (if any):</summary>
<!-- Paste your complete test config file here (even if it is huge): -->
```
{
"screenshotPath": "screenshots",
"browsers": "chrome",
"screenshots": {
"takeOnFails": true,
"fullPage": true
},
"debugOnFail": false,
"skipJsErrors": true,
"disablePageCaching": false,
"color": true,
"speed": 0.90,
"pageLoadTimeout": 30000,
"assertionTimeout": 30000,
"selectorTimeout": 30000,
"concurrency": 1
}
```
</details>
### Your Environment details:
* testcafe version: 1.9.4
* node.js version: v11.2.0
command-line arguments: no additional parameters
* platform and version: Chrome - Version 85.0.4183.121 (Official Build) (64-bit)
* other: win 10
|
process
|
adobe launch analytics requests are not fired when accessing a page with testcafe what is your scenario if accessing the test website with the testcafe the adobe launch analytics requests are not displayed if accessing the page manually in browser adobe launch analytics requests are displayed steps to reproduce for testcafe run the attached code while the page is loading open the developer tools and filter the requests by b ss and observe that there are no requests manually it cannot be reproduced what is the current behavior when accessing the website the testcafe no adobe launch analytics requests are fired also one or more javascript errors are displayed in console cannot read property indexof of undefined note that these javascript errors are not displayed when visiting the page manually what is the expected behavior when visiting the page with testcafe adobe launch analytics can be fired no javascript errors are displayed after accepting the cookies what is your public web site url your complete test code or attach your test files import selector from testcafe fixture analytics beforeeach async t await t maximizewindow test adobe launch analytics async t await t navigateto await t click selector onetrust pc btn handler await t click selector div nth child label span ot switch nob filtervisible await t click selector div nth child label span ot switch nob filtervisible await t click selector div nth child label span ot switch nob filtervisible await t click save preference btn handler onetrust close btn handler await t debug your complete configuration file if any screenshotpath screenshots browsers chrome screenshots takeonfails true fullpage true debugonfail false skipjserrors true disablepagecaching false color true speed pageloadtimeout assertiontimeout selectortimeout concurrency your environment details testcafe version node js version command line arguments no additional parameters platform and version chrome version official build bit other win
| 1
|
78,891
| 3,518,857,208
|
IssuesEvent
|
2016-01-12 14:47:43
|
chef/chef
|
https://api.github.com/repos/chef/chef
|
closed
|
chef mangles umlauts in attributes
|
Bug Medium Priority
|
We found out, when we save strings in chef attributes using "knife edit" or the ruby API "rbvmomi", German umlauts (äüöß) get mangled. I'm not sure if this occurs during the saving of the attribute or the loading. The string we saved was UTF-8 encoded. The whole system is configured to use UTF-8. We also could reproduce the error on multiple systems.
Steps to reproduce:
- knife node edit <somenode>
- put an umlaut somewhere in the nodes normal attributes
- save
- knife node edit <somenode>
chef-server: chef-server-core-12.3.1-1.el6.x86_64 running on centos 6.7 (system configured UTF-8)
chef-client: chef-12.5.1-1.el6.x86_64 running on centos 6.7 or sles 11.4 (system configured UTF-8)
Thank you in advance!
Kind regards,
Lothar
|
1.0
|
chef mangles umlauts in attributes - We found out, when we save strings in chef attributes using "knife edit" or the ruby API "rbvmomi", German umlauts (äüöß) get mangled. I'm not sure if this occurs during the saving of the attribute or the loading. The string we saved was UTF-8 encoded. The whole system is configured to use UTF-8. We also could reproduce the error on multiple systems.
Steps to reproduce:
- knife node edit <somenode>
- put an umlaut somewhere in the nodes normal attributes
- save
- knife node edit <somenode>
chef-server: chef-server-core-12.3.1-1.el6.x86_64 running on centos 6.7 (system configured UTF-8)
chef-client: chef-12.5.1-1.el6.x86_64 running on centos 6.7 or sles 11.4 (system configured UTF-8)
Thank you in advance!
Kind regards,
Lothar
|
non_process
|
chef mangles umlauts in attributes we found out when we save strings in chef attributes using knife edit or the ruby api rbvmomi german umlauts äüöß get mangled i m not sure if this occurs during the saving of the attribute or the loading the string we saved was utf encoded the whole system is configured to use utf we also could reproduce the error on multiple systems steps to reproduce knife node edit put an umlaut somewhere in the nodes normal attributes save knife node edit chef server chef server core running on centos system configured utf chef client chef running on centos or sles system configured utf thank you in advance kind regards lothar
| 0
|
12,173
| 14,741,890,600
|
IssuesEvent
|
2021-01-07 11:20:27
|
kdjstudios/SABillingGitlab
|
https://api.github.com/repos/kdjstudios/SABillingGitlab
|
closed
|
Delete payment of account if there's no finalized invoice yet
|
anc-process anp-3 ant-bug
|
In GitLab by @pchaudhary on Feb 26, 2019, 06:46
We found a scenario while some internal testing that there's an error while deleting the payment if there's no finalized invoice on the account yet.
|
1.0
|
Delete payment of account if there's no finalized invoice yet - In GitLab by @pchaudhary on Feb 26, 2019, 06:46
We found a scenario while some internal testing that there's an error while deleting the payment if there's no finalized invoice on the account yet.
|
process
|
delete payment of account if there s no finalized invoice yet in gitlab by pchaudhary on feb we found a scenario while some internal testing that there s an error while deleting the payment if there s no finalized invoice on the account yet
| 1
|
4,546
| 7,375,309,925
|
IssuesEvent
|
2018-03-13 23:45:09
|
MicrosoftDocs/azure-docs
|
https://api.github.com/repos/MicrosoftDocs/azure-docs
|
closed
|
Misspelling of addEdges
|
cosmos-db cxp doc-bug in-process triaged
|
Search for “addEges”, update to “addEdges”.
---
#### Document Details
⚠ *Do not edit this section. It is required for docs.microsoft.com ➟ GitHub issue linking.*
* ID: 0374b015-1139-9df8-6c96-a8657a088748
* Version Independent ID: 2e18d44d-de3c-eee5-7932-0a5a967e9b17
* Content: [Azure Cosmos DB Gremlin support | Microsoft Docs](https://docs.microsoft.com/en-us/azure/cosmos-db/gremlin-support)
* Content Source: [articles/cosmos-db/gremlin-support.md](https://github.com/Microsoft/azure-docs/blob/master/articles/cosmos-db/gremlin-support.md)
* Service: **cosmos-db**
* GitHub Login: @LuisBosquez
* Microsoft Alias: **lbosq**
|
1.0
|
Misspelling of addEdges - Search for “addEges”, update to “addEdges”.
---
#### Document Details
⚠ *Do not edit this section. It is required for docs.microsoft.com ➟ GitHub issue linking.*
* ID: 0374b015-1139-9df8-6c96-a8657a088748
* Version Independent ID: 2e18d44d-de3c-eee5-7932-0a5a967e9b17
* Content: [Azure Cosmos DB Gremlin support | Microsoft Docs](https://docs.microsoft.com/en-us/azure/cosmos-db/gremlin-support)
* Content Source: [articles/cosmos-db/gremlin-support.md](https://github.com/Microsoft/azure-docs/blob/master/articles/cosmos-db/gremlin-support.md)
* Service: **cosmos-db**
* GitHub Login: @LuisBosquez
* Microsoft Alias: **lbosq**
|
process
|
misspelling of addedges search for “addeges” update to “addedges” document details ⚠ do not edit this section it is required for docs microsoft com ➟ github issue linking id version independent id content content source service cosmos db github login luisbosquez microsoft alias lbosq
| 1
|
411,440
| 12,018,064,593
|
IssuesEvent
|
2020-04-10 19:53:57
|
webcompat/web-bugs
|
https://api.github.com/repos/webcompat/web-bugs
|
closed
|
www.who.int - Random question highlighted when scrolling down the page
|
browser-fenix engine-gecko form-v2-experiment priority-normal severity-minor type-covid19
|
<!-- @browser: Firefox Preview Nightly 200323 (🦎: 76.0a1-20200320095353) -->
<!-- @ua_header: Mozilla/5.0 (Windows NT 10.0; Win64; x64; rv:76.0) Gecko/20100101 Firefox/76.0 -->
<!-- @reported_with: -->
<!-- @extra_labels: form-v2-experiment -->
**URL**: https://www.who.int/news-room/q-a-detail/q-a-coronaviruses
**Browser / Version**: Firefox Preview Nightly 200323 (🦎: 76.0a1-20200320095353)
**Operating System**: Huawei P20 Lite (Android 8.0.0) - 1080 x 2280 pixels, 19:9 ratio (~432 ppi density)
**Tested Another Browser**: Yes Chrome
**Problem type**: Something else
**Description**: Random question highlighted when scrolling down the page
**Steps to Reproduce**:
1. Navigate to https://www.who.int/news-room/q-a-detail/q-a-coronaviruses
2. Tap to select a question.
3. Scroll down the page and observe behavior.
**Expected Behavior:**
The selected question remains highlighted.
**Actual Behavior:**
A random question is highlighted.
**Notes:**
1. Screenshot attached.
2. The issue is not reproducible on Chrome 80.0.3987.149.
3. The issue is also reproducible when expanding a question and scrolling down the page. The highlight on the expanded question jumps to the next question.
Watchers:
@softvision-oana-arbuzov
@softvision-sergiulogigan
@cipriansv

<details><summary>View the screenshot</summary><img alt='Screenshot' src='https://webcompat.com/uploads/2020/3/674a242c-da0e-42e6-aba3-d9121764c8e7.jpg'></details>
<details>
<summary>Browser Configuration</summary>
<ul>
<li>None</li>
</ul>
</details>
_From [webcompat.com](https://webcompat.com/) with ❤️_
|
1.0
|
www.who.int - Random question highlighted when scrolling down the page - <!-- @browser: Firefox Preview Nightly 200323 (🦎: 76.0a1-20200320095353) -->
<!-- @ua_header: Mozilla/5.0 (Windows NT 10.0; Win64; x64; rv:76.0) Gecko/20100101 Firefox/76.0 -->
<!-- @reported_with: -->
<!-- @extra_labels: form-v2-experiment -->
**URL**: https://www.who.int/news-room/q-a-detail/q-a-coronaviruses
**Browser / Version**: Firefox Preview Nightly 200323 (🦎: 76.0a1-20200320095353)
**Operating System**: Huawei P20 Lite (Android 8.0.0) - 1080 x 2280 pixels, 19:9 ratio (~432 ppi density)
**Tested Another Browser**: Yes Chrome
**Problem type**: Something else
**Description**: Random question highlighted when scrolling down the page
**Steps to Reproduce**:
1. Navigate to https://www.who.int/news-room/q-a-detail/q-a-coronaviruses
2. Tap to select a question.
3. Scroll down the page and observe behavior.
**Expected Behavior:**
The selected question remains highlighted.
**Actual Behavior:**
A random question is highlighted.
**Notes:**
1. Screenshot attached.
2. The issue is not reproducible on Chrome 80.0.3987.149.
3. The issue is also reproducible when expanding a question and scrolling down the page. The highlight on the expanded question jumps to the next question.
Watchers:
@softvision-oana-arbuzov
@softvision-sergiulogigan
@cipriansv

<details><summary>View the screenshot</summary><img alt='Screenshot' src='https://webcompat.com/uploads/2020/3/674a242c-da0e-42e6-aba3-d9121764c8e7.jpg'></details>
<details>
<summary>Browser Configuration</summary>
<ul>
<li>None</li>
</ul>
</details>
_From [webcompat.com](https://webcompat.com/) with ❤️_
|
non_process
|
random question highlighted when scrolling down the page url browser version firefox preview nightly 🦎 operating system huawei lite android x pixels ratio ppi density tested another browser yes chrome problem type something else description random question highlighted when scrolling down the page steps to reproduce navigate to tap to select a question scroll down the page and observe behavior expected behavior the selected question remains highlighted actual behavior a random question is highlighted notes screenshot attached the issue is not reproducible on chrome the issue is also reproducible when expanding a question and scrolling down the page the highlight on the expanded question jumps to the next question watchers softvision oana arbuzov softvision sergiulogigan cipriansv view the screenshot img alt screenshot src browser configuration none from with ❤️
| 0
|
211,941
| 23,856,833,442
|
IssuesEvent
|
2022-09-07 01:08:00
|
sdellenb/play-sound-mplayer
|
https://api.github.com/repos/sdellenb/play-sound-mplayer
|
opened
|
WS-2021-0638 (High) detected in mocha-6.0.2.tgz
|
security vulnerability
|
## WS-2021-0638 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>mocha-6.0.2.tgz</b></p></summary>
<p>simple, flexible, fun test framework</p>
<p>Library home page: <a href="https://registry.npmjs.org/mocha/-/mocha-6.0.2.tgz">https://registry.npmjs.org/mocha/-/mocha-6.0.2.tgz</a></p>
<p>Path to dependency file: /play-sound-mplayer/package.json</p>
<p>Path to vulnerable library: /node_modules/mocha/package.json</p>
<p>
Dependency Hierarchy:
- :x: **mocha-6.0.2.tgz** (Vulnerable Library)
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
There is regular Expression Denial of Service (ReDoS) vulnerability in mocha.
It can cause a denial of service when stripping a crafted invalid function definition from strings.
<p>Publish Date: 2021-09-18
<p>URL: <a href=https://github.com/mochajs/mocha/commit/61b4b9209c2c64b32c8d48b1761c3b9384d411ea>WS-2021-0638</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.5</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://huntr.dev/bounties/1d8a3d95-d199-4129-a6ad-8eafe5e77b9e/">https://huntr.dev/bounties/1d8a3d95-d199-4129-a6ad-8eafe5e77b9e/</a></p>
<p>Release Date: 2021-09-18</p>
<p>Fix Resolution: https://github.com/mochajs/mocha/commit/61b4b9209c2c64b32c8d48b1761c3b9384d411ea</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with Mend [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
|
True
|
WS-2021-0638 (High) detected in mocha-6.0.2.tgz - ## WS-2021-0638 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>mocha-6.0.2.tgz</b></p></summary>
<p>simple, flexible, fun test framework</p>
<p>Library home page: <a href="https://registry.npmjs.org/mocha/-/mocha-6.0.2.tgz">https://registry.npmjs.org/mocha/-/mocha-6.0.2.tgz</a></p>
<p>Path to dependency file: /play-sound-mplayer/package.json</p>
<p>Path to vulnerable library: /node_modules/mocha/package.json</p>
<p>
Dependency Hierarchy:
- :x: **mocha-6.0.2.tgz** (Vulnerable Library)
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
There is regular Expression Denial of Service (ReDoS) vulnerability in mocha.
It can cause a denial of service when stripping a crafted invalid function definition from strings.
<p>Publish Date: 2021-09-18
<p>URL: <a href=https://github.com/mochajs/mocha/commit/61b4b9209c2c64b32c8d48b1761c3b9384d411ea>WS-2021-0638</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.5</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://huntr.dev/bounties/1d8a3d95-d199-4129-a6ad-8eafe5e77b9e/">https://huntr.dev/bounties/1d8a3d95-d199-4129-a6ad-8eafe5e77b9e/</a></p>
<p>Release Date: 2021-09-18</p>
<p>Fix Resolution: https://github.com/mochajs/mocha/commit/61b4b9209c2c64b32c8d48b1761c3b9384d411ea</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with Mend [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
|
non_process
|
ws high detected in mocha tgz ws high severity vulnerability vulnerable library mocha tgz simple flexible fun test framework library home page a href path to dependency file play sound mplayer package json path to vulnerable library node modules mocha package json dependency hierarchy x mocha tgz vulnerable library vulnerability details there is regular expression denial of service redos vulnerability in mocha it can cause a denial of service when stripping a crafted invalid function definition from strings publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction none scope unchanged impact metrics confidentiality impact none integrity impact none availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution step up your open source security game with mend
| 0
|
20,774
| 27,505,860,396
|
IssuesEvent
|
2023-03-06 03:19:12
|
AssetRipper/AssetRipper
|
https://api.github.com/repos/AssetRipper/AssetRipper
|
closed
|
`PlayerSettings.m_AllowUnsafeCode` should be true
|
enhancement scripts processing
|
### Describe the new feature or enhancement
As discussed in #633, `PlayerSettings.m_AllowUnsafeCode` should be true. This will require some changes to source generation.
After the changes to source generation are complete, all that remains is a special case for `PlayerSettings` in `EditorFormatProcessor`.
|
1.0
|
`PlayerSettings.m_AllowUnsafeCode` should be true - ### Describe the new feature or enhancement
As discussed in #633, `PlayerSettings.m_AllowUnsafeCode` should be true. This will require some changes to source generation.
After the changes to source generation are complete, all that remains is a special case for `PlayerSettings` in `EditorFormatProcessor`.
|
process
|
playersettings m allowunsafecode should be true describe the new feature or enhancement as discussed in playersettings m allowunsafecode should be true this will require some changes to source generation after the changes to source generation are complete all that remains is a special case for playersettings in editorformatprocessor
| 1
|
187,165
| 22,038,466,315
|
IssuesEvent
|
2022-05-29 00:52:50
|
switcherapi/switcher-api
|
https://api.github.com/repos/switcherapi/switcher-api
|
closed
|
Security issue: Vulnerabilities found on Dockerfile - node:16.13-alpine3.15
|
security patch
|
Ref: https://app.snyk.io/org/petruki/project/564fc144-4094-407b-8024-ffdd6ae001c5
**Critical**:
https://security.snyk.io/vuln/SNYK-ALPINE315-BUSYBOX-2440607
Vulnerability in busybox/ssl_client 1.34.1-r3. No remediation available yet.
**High**
https://security.snyk.io/vuln/SNYK-ALPINE315-ZLIB-2434420
Vulnerability in zlib/zlib 1.2.11-r3. No remediation available yet.
**High**
https://security.snyk.io/vuln/SNYK-ALPINE315-OPENSSL-2426331
Vulnerability in openssl/libssl1.1 1.1.1l-r7. No remediation available yet.
|
True
|
Security issue: Vulnerabilities found on Dockerfile - node:16.13-alpine3.15 - Ref: https://app.snyk.io/org/petruki/project/564fc144-4094-407b-8024-ffdd6ae001c5
**Critical**:
https://security.snyk.io/vuln/SNYK-ALPINE315-BUSYBOX-2440607
Vulnerability in busybox/ssl_client 1.34.1-r3. No remediation available yet.
**High**
https://security.snyk.io/vuln/SNYK-ALPINE315-ZLIB-2434420
Vulnerability in zlib/zlib 1.2.11-r3. No remediation available yet.
**High**
https://security.snyk.io/vuln/SNYK-ALPINE315-OPENSSL-2426331
Vulnerability in openssl/libssl1.1 1.1.1l-r7. No remediation available yet.
|
non_process
|
security issue vulnerabilities found on dockerfile node ref critical vulnerability in busybox ssl client no remediation available yet high vulnerability in zlib zlib no remediation available yet high vulnerability in openssl no remediation available yet
| 0
|
216,649
| 24,287,250,324
|
IssuesEvent
|
2022-09-29 00:09:55
|
samq-democorp/Umbraco-CMS
|
https://api.github.com/repos/samq-democorp/Umbraco-CMS
|
opened
|
microsoft.codedom.providers.dotnetcompilerplatform.2.0.1.nupkg: 5 vulnerabilities (highest severity is: 7.5)
|
security vulnerability
|
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>microsoft.codedom.providers.dotnetcompilerplatform.2.0.1.nupkg</b></p></summary>
<p>Replacement CodeDOM providers that use the new .NET Compiler Platform ("Roslyn") compiler as a servi...</p>
<p>Library home page: <a href="https://api.nuget.org/packages/microsoft.codedom.providers.dotnetcompilerplatform.2.0.1.nupkg">https://api.nuget.org/packages/microsoft.codedom.providers.dotnetcompilerplatform.2.0.1.nupkg</a></p>
<p>Path to dependency file: /src/Umbraco.Tests.Benchmarks/Umbraco.Tests.Benchmarks.csproj</p>
<p>Path to vulnerable library: /t/packages/microsoft.codedom.providers.dotnetcompilerplatform/2.0.1/microsoft.codedom.providers.dotnetcompilerplatform.2.0.1.nupkg,/home/wss-scanner/.nuget/packages/microsoft.codedom.providers.dotnetcompilerplatform/2.0.1/microsoft.codedom.providers.dotnetcompilerplatform.2.0.1.nupkg</p>
<p>
<p>Found in HEAD commit: <a href="https://github.com/samq-democorp/Umbraco-CMS/commit/07d00f9f09c53bd7fd2cc157f7b57dbcbbc93080">07d00f9f09c53bd7fd2cc157f7b57dbcbbc93080</a></p></details>
## Vulnerabilities
| CVE | Severity | <img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS | Dependency | Type | Fixed in | Remediation Available |
| ------------- | ------------- | ----- | ----- | ----- | --- | --- |
| [CVE-2018-8292](https://vuln.whitesourcesoftware.com/vulnerability/CVE-2018-8292) | <img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> High | 7.5 | detected in multiple dependencies | Direct | System.Net.Http - 4.3.4;Microsoft.PowerShell.Commands.Utility - 6.1.0-rc.1 | ✅ |
| [CVE-2017-0247](https://vuln.whitesourcesoftware.com/vulnerability/CVE-2017-0247) | <img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> High | 7.5 | detected in multiple dependencies | Direct | System.Text.Encodings.Web - 4.0.1,4.3.1;System.Net.Http - 4.1.2,4.3.2;System.Net.Http.WinHttpHandler - 4.0.2,4.5.4;System.Net.Security - 4.0.1,4.3.1;System.Net.WebSockets.Client - 4.0.1,4.3.1;Microsoft.AspNetCore.Mvc - 1.0.4,1.1.3;Microsoft.AspNetCore.Mvc.Core - 1.0.4,1.1.3;Microsoft.AspNetCore.Mvc.Abstractions - 1.0.4,1.1.3;Microsoft.AspNetCore.Mvc.ApiExplorer - 1.0.4,1.1.3;Microsoft.AspNetCore.Mvc.Cors - 1.0.4,1.1.3;Microsoft.AspNetCore.Mvc.DataAnnotations - 1.0.4,1.1.3;Microsoft.AspNetCore.Mvc.Formatters.Json - 1.0.4,1.1.3;Microsoft.AspNetCore.Mvc.Formatters.Xml - 1.0.4,1.1.3;Microsoft.AspNetCore.Mvc.Localization - 1.0.4,1.1.3;Microsoft.AspNetCore.Mvc.Razor.Host - 1.0.4,1.1.3;Microsoft.AspNetCore.Mvc.Razor - 1.0.4,1.1.3;Microsoft.AspNetCore.Mvc.TagHelpers - 1.0.4,1.1.3;Microsoft.AspNetCore.Mvc.ViewFeatures - 1.0.4,1.1.3;Microsoft.AspNetCore.Mvc.WebApiCompatShim - 1.0.4,1.1.3 | ✅ |
| [CVE-2017-0248](https://vuln.whitesourcesoftware.com/vulnerability/CVE-2017-0248) | <img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> High | 7.5 | detected in multiple dependencies | Direct | System.Text.Encodings.Web - 4.0.1, 4.3.1;System.Net.Http - 4.1.2, 4.3.2;System.Net.Http.WinHttpHandler - 4.0.2, 4.3.1;System.Net.Security - 4.0.1, 4.3.1;System.Net.WebSockets.Client - 4.0.1, 4.3.1;Microsoft.AspNetCore.Mvc - 1.0.4, 1.1.3;Microsoft.AspNetCore.Mvc.Core - 1.0.4, 1.1.3;Microsoft.AspNetCore.Mvc.Abstractions - 1.0.4, 1.1.3;Microsoft.AspNetCore.Mvc.ApiExplorer - 1.0.4, 1.1.3;Microsoft.AspNetCore.Mvc.Cors - 1.0.4, 1.1.3;Microsoft.AspNetCore.Mvc.DataAnnotations - 1.0.4, 1.1.3;Microsoft.AspNetCore.Mvc.Formatters.Json - 1.0.4, 1.1.3;Microsoft.AspNetCore.Mvc.Formatters.Xml - 1.0.4, 1.1.3;Microsoft.AspNetCore.Mvc.Localization - 1.0.4, 1.1.3;Microsoft.AspNetCore.Mvc.Razor.Host - 1.0.4, 1.1.3;Microsoft.AspNetCore.Mvc.Razor - 1.0.4, 1.1.3;Microsoft.AspNetCore.Mvc.TagHelpers - 1.0.4, 1.1.3;Microsoft.AspNetCore.Mvc.ViewFeatures - 1.0.4, 1.1.3;Microsoft.AspNetCore.Mvc.WebApiCompatShim - 1.0.4, 1.1.3 | ✅ |
| [CVE-2017-0249](https://vuln.whitesourcesoftware.com/vulnerability/CVE-2017-0249) | <img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> High | 7.3 | detected in multiple dependencies | Direct | System.Text.Encodings.Web - 4.0.1,4.3.1;System.Net.Http - 4.1.2,4.3.2;System.Net.Http.WinHttpHandler - 4.0.2,4.3.1;System.Net.Security - 4.0.1,4.3.1;System.Net.WebSockets.Client - 4.0.1,4.3.1;Microsoft.AspNetCore.Mvc - 1.0.4,1.1.3;Microsoft.AspNetCore.Mvc.Core - 1.0.4,1.1.3;Microsoft.AspNetCore.Mvc.Abstractions - 1.0.4,1.1.3;Microsoft.AspNetCore.Mvc.ApiExplorer - 1.0.4,1.1.3;Microsoft.AspNetCore.Mvc.Cors - 1.0.4,1.1.3;Microsoft.AspNetCore.Mvc.DataAnnotations - 1.0.4,1.1.3;Microsoft.AspNetCore.Mvc.Formatters.Json - 1.0.4,1.1.3;Microsoft.AspNetCore.Mvc.Formatters.Xml - 1.0.4,1.1.3;Microsoft.AspNetCore.Mvc.Localization - 1.0.4,1.1.3;Microsoft.AspNetCore.Mvc.Razor.Host - 1.0.4,1.1.3;Microsoft.AspNetCore.Mvc.Razor - 1.0.4,1.1.3;Microsoft.AspNetCore.Mvc.TagHelpers - 1.0.4,1.1.3;Microsoft.AspNetCore.Mvc.ViewFeatures - 1.0.4,1.1.3;Microsoft.AspNetCore.Mvc.WebApiCompatShim - 1.0.4,1.1.3 | ✅ |
| [CVE-2017-0256](https://vuln.whitesourcesoftware.com/vulnerability/CVE-2017-0256) | <img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Medium | 5.3 | detected in multiple dependencies | Direct | Microsoft.AspNetCore.Mvc.ApiExplorer - 1.1.3,1.0.4;Microsoft.AspNetCore.Mvc.Abstractions - 1.1.3,1.0.4;Microsoft.AspNetCore.Mvc.Core - 1.0.4,1.1.3;Microsoft.AspNetCore.Mvc.Cors - 1.0.4,1.1.3;Microsoft.AspNetCore.Mvc.Localization - 1.1.3,1.0.4;System.Net.Http - 4.1.2,4.3.2;Microsoft.AspNetCore.Mvc.Razor - 1.1.3,1.0.4;System.Net.Http.WinHttpHandler - 4.0.2,4.3.0-preview1-24530-04;System.Net.Security - 4.3.0-preview1-24530-04,4.0.1;Microsoft.AspNetCore.Mvc.ViewFeatures - 1.1.3,1.0.4;Microsoft.AspNetCore.Mvc.TagHelpers - 1.0.4,1.1.3;System.Text.Encodings.Web - 4.3.0-preview1-24530-04,4.0.1;Microsoft.AspNetCore.Mvc.Razor.Host - 1.1.3,1.0.4;Microsoft.AspNetCore.Mvc.Formatters.Json - 1.0.4,1.1.3;Microsoft.AspNetCore.Mvc.WebApiCompatShim - 1.0.4,1.1.3;System.Net.WebSockets.Client - 4.3.0-preview1-24530-04,4.0.1;Microsoft.AspNetCore.Mvc.Formatters.Xml - 1.1.3,1.0.4;Microsoft.AspNetCore.Mvc.DataAnnotations - 1.0.4,1.1.3 | ✅ |
## Details
<details>
<summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> CVE-2018-8292</summary>
### Vulnerable Libraries - <b>microsoft.codedom.providers.dotnetcompilerplatform.2.0.1.nupkg</b>, <b>microsoft.codedom.providers.dotnetcompilerplatform.2.0.1.nupkg</b></p>
<p>
### <b>microsoft.codedom.providers.dotnetcompilerplatform.2.0.1.nupkg</b></p>
<p>Replacement CodeDOM providers that use the new .NET Compiler Platform ("Roslyn") compiler as a servi...</p>
<p>Library home page: <a href="https://api.nuget.org/packages/microsoft.codedom.providers.dotnetcompilerplatform.2.0.1.nupkg">https://api.nuget.org/packages/microsoft.codedom.providers.dotnetcompilerplatform.2.0.1.nupkg</a></p>
<p>Path to dependency file: /src/Umbraco.Tests.Benchmarks/Umbraco.Tests.Benchmarks.csproj</p>
<p>Path to vulnerable library: /t/packages/microsoft.codedom.providers.dotnetcompilerplatform/2.0.1/microsoft.codedom.providers.dotnetcompilerplatform.2.0.1.nupkg,/home/wss-scanner/.nuget/packages/microsoft.codedom.providers.dotnetcompilerplatform/2.0.1/microsoft.codedom.providers.dotnetcompilerplatform.2.0.1.nupkg</p>
<p>
Dependency Hierarchy:
- :x: **microsoft.codedom.providers.dotnetcompilerplatform.2.0.1.nupkg** (Vulnerable Library)
### <b>microsoft.codedom.providers.dotnetcompilerplatform.2.0.1.nupkg</b></p>
<p>Replacement CodeDOM providers that use the new .NET Compiler Platform ("Roslyn") compiler as a servi...</p>
<p>Library home page: <a href="https://api.nuget.org/packages/microsoft.codedom.providers.dotnetcompilerplatform.2.0.1.nupkg">https://api.nuget.org/packages/microsoft.codedom.providers.dotnetcompilerplatform.2.0.1.nupkg</a></p>
<p>
Dependency Hierarchy:
- :x: **microsoft.codedom.providers.dotnetcompilerplatform.2.0.1.nupkg** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/samq-democorp/Umbraco-CMS/commit/07d00f9f09c53bd7fd2cc157f7b57dbcbbc93080">07d00f9f09c53bd7fd2cc157f7b57dbcbbc93080</a></p>
<p>Found in base branch: <b>v8/dev</b></p>
</p>
<p></p>
### Vulnerability Details
<p>
An information disclosure vulnerability exists in .NET Core when authentication information is inadvertently exposed in a redirect, aka ".NET Core Information Disclosure Vulnerability." This affects .NET Core 2.1, .NET Core 1.0, .NET Core 1.1, PowerShell Core 6.0.
<p>Publish Date: 2018-10-10
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2018-8292>CVE-2018-8292</a></p>
</p>
<p></p>
### CVSS 3 Score Details (<b>7.5</b>)
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: None
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
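The base-score arithmetic behind the metrics listed above can be sketched as follows. This is a minimal, hypothetical helper (not part of this report or of any scanner's API) implementing the CVSS v3.0 base-score formula for Scope: Unchanged; the metric weights are taken from the CVSS specification:

```python
import math

# Metric weights from the CVSS v3.0 specification (Scope: Unchanged).
AV  = {"N": 0.85, "A": 0.62, "L": 0.55, "P": 0.2}   # Attack Vector
AC  = {"L": 0.77, "H": 0.44}                         # Attack Complexity
PR  = {"N": 0.85, "L": 0.62, "H": 0.27}              # Privileges Required
UI  = {"N": 0.85, "R": 0.62}                         # User Interaction
CIA = {"H": 0.56, "L": 0.22, "N": 0.0}               # C/I/A impact

def roundup(x):
    # CVSS "round up to one decimal place"
    return math.ceil(x * 10) / 10

def base_score_unchanged(av, ac, pr, ui, c, i, a):
    # Impact sub-score and exploitability, then the capped, rounded sum.
    iss = 1 - (1 - CIA[c]) * (1 - CIA[i]) * (1 - CIA[a])
    impact = 6.42 * iss
    exploitability = 8.22 * AV[av] * AC[ac] * PR[pr] * UI[ui]
    if impact <= 0:
        return 0.0
    return roundup(min(impact + exploitability, 10))

# The metrics listed for CVE-2018-8292 (AV:N/AC:L/PR:N/UI:N/S:U/C:H/I:N/A:N)
print(base_score_unchanged("N", "L", "N", "N", "H", "N", "N"))  # → 7.5
```

Plugging in the metrics shown for CVE-2018-8292 reproduces the 7.5 score in the table; the 7.3 and 5.3 scores for the other CVEs fall out of the same formula with their respective impact values.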
<p></p>
### Suggested Fix
<p>
<p>Type: Upgrade version</p>
<p>Release Date: 2018-10-10</p>
<p>Fix Resolution: System.Net.Http - 4.3.4;Microsoft.PowerShell.Commands.Utility - 6.1.0-rc.1</p>
</p>
<p></p>
:rescue_worker_helmet: Automatic Remediation is available for this issue
</details><details>
<summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> CVE-2017-0247</summary>
### Vulnerable Libraries - <b>microsoft.codedom.providers.dotnetcompilerplatform.2.0.1.nupkg</b>, <b>microsoft.codedom.providers.dotnetcompilerplatform.2.0.1.nupkg</b></p>
<p>
### <b>microsoft.codedom.providers.dotnetcompilerplatform.2.0.1.nupkg</b></p>
<p>Replacement CodeDOM providers that use the new .NET Compiler Platform ("Roslyn") compiler as a servi...</p>
<p>Library home page: <a href="https://api.nuget.org/packages/microsoft.codedom.providers.dotnetcompilerplatform.2.0.1.nupkg">https://api.nuget.org/packages/microsoft.codedom.providers.dotnetcompilerplatform.2.0.1.nupkg</a></p>
<p>
Dependency Hierarchy:
- :x: **microsoft.codedom.providers.dotnetcompilerplatform.2.0.1.nupkg** (Vulnerable Library)
### <b>microsoft.codedom.providers.dotnetcompilerplatform.2.0.1.nupkg</b></p>
<p>Replacement CodeDOM providers that use the new .NET Compiler Platform ("Roslyn") compiler as a servi...</p>
<p>Library home page: <a href="https://api.nuget.org/packages/microsoft.codedom.providers.dotnetcompilerplatform.2.0.1.nupkg">https://api.nuget.org/packages/microsoft.codedom.providers.dotnetcompilerplatform.2.0.1.nupkg</a></p>
<p>Path to dependency file: /src/Umbraco.Tests.Benchmarks/Umbraco.Tests.Benchmarks.csproj</p>
<p>Path to vulnerable library: /t/packages/microsoft.codedom.providers.dotnetcompilerplatform/2.0.1/microsoft.codedom.providers.dotnetcompilerplatform.2.0.1.nupkg,/home/wss-scanner/.nuget/packages/microsoft.codedom.providers.dotnetcompilerplatform/2.0.1/microsoft.codedom.providers.dotnetcompilerplatform.2.0.1.nupkg</p>
<p>
Dependency Hierarchy:
- :x: **microsoft.codedom.providers.dotnetcompilerplatform.2.0.1.nupkg** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/samq-democorp/Umbraco-CMS/commit/07d00f9f09c53bd7fd2cc157f7b57dbcbbc93080">07d00f9f09c53bd7fd2cc157f7b57dbcbbc93080</a></p>
<p>Found in base branch: <b>v8/dev</b></p>
</p>
<p></p>
### Vulnerability Details
<p>
A denial of service vulnerability exists when the ASP.NET Core fails to properly validate web requests. NOTE: Microsoft has not commented on third-party claims that the issue is that the TextEncoder.EncodeCore function in the System.Text.Encodings.Web package in ASP.NET Core Mvc before 1.0.4 and 1.1.x before 1.1.3 allows remote attackers to cause a denial of service by leveraging failure to properly calculate the length of 4-byte characters in the Unicode Non-Character range.
<p>Publish Date: 2017-05-12
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2017-0247>CVE-2017-0247</a></p>
</p>
<p></p>
### CVSS 3 Score Details (<b>7.5</b>)
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: High
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
<p></p>
### Suggested Fix
<p>
<p>Type: Upgrade version</p>
<p>Release Date: 2017-05-12</p>
<p>Fix Resolution: System.Text.Encodings.Web - 4.0.1,4.3.1;System.Net.Http - 4.1.2,4.3.2;System.Net.Http.WinHttpHandler - 4.0.2,4.5.4;System.Net.Security - 4.0.1,4.3.1;System.Net.WebSockets.Client - 4.0.1,4.3.1;Microsoft.AspNetCore.Mvc - 1.0.4,1.1.3;Microsoft.AspNetCore.Mvc.Core - 1.0.4,1.1.3;Microsoft.AspNetCore.Mvc.Abstractions - 1.0.4,1.1.3;Microsoft.AspNetCore.Mvc.ApiExplorer - 1.0.4,1.1.3;Microsoft.AspNetCore.Mvc.Cors - 1.0.4,1.1.3;Microsoft.AspNetCore.Mvc.DataAnnotations - 1.0.4,1.1.3;Microsoft.AspNetCore.Mvc.Formatters.Json - 1.0.4,1.1.3;Microsoft.AspNetCore.Mvc.Formatters.Xml - 1.0.4,1.1.3;Microsoft.AspNetCore.Mvc.Localization - 1.0.4,1.1.3;Microsoft.AspNetCore.Mvc.Razor.Host - 1.0.4,1.1.3;Microsoft.AspNetCore.Mvc.Razor - 1.0.4,1.1.3;Microsoft.AspNetCore.Mvc.TagHelpers - 1.0.4,1.1.3;Microsoft.AspNetCore.Mvc.ViewFeatures - 1.0.4,1.1.3;Microsoft.AspNetCore.Mvc.WebApiCompatShim - 1.0.4,1.1.3</p>
</p>
<p></p>
:rescue_worker_helmet: Automatic Remediation is available for this issue
</details><details>
<summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> CVE-2017-0248</summary>
### Vulnerable Libraries - <b>microsoft.codedom.providers.dotnetcompilerplatform.2.0.1.nupkg</b>, <b>microsoft.codedom.providers.dotnetcompilerplatform.2.0.1.nupkg</b></p>
<p>
### <b>microsoft.codedom.providers.dotnetcompilerplatform.2.0.1.nupkg</b></p>
<p>Replacement CodeDOM providers that use the new .NET Compiler Platform ("Roslyn") compiler as a servi...</p>
<p>Library home page: <a href="https://api.nuget.org/packages/microsoft.codedom.providers.dotnetcompilerplatform.2.0.1.nupkg">https://api.nuget.org/packages/microsoft.codedom.providers.dotnetcompilerplatform.2.0.1.nupkg</a></p>
<p>
Dependency Hierarchy:
- :x: **microsoft.codedom.providers.dotnetcompilerplatform.2.0.1.nupkg** (Vulnerable Library)
### <b>microsoft.codedom.providers.dotnetcompilerplatform.2.0.1.nupkg</b></p>
<p>Replacement CodeDOM providers that use the new .NET Compiler Platform ("Roslyn") compiler as a servi...</p>
<p>Library home page: <a href="https://api.nuget.org/packages/microsoft.codedom.providers.dotnetcompilerplatform.2.0.1.nupkg">https://api.nuget.org/packages/microsoft.codedom.providers.dotnetcompilerplatform.2.0.1.nupkg</a></p>
<p>Path to dependency file: /src/Umbraco.Tests.Benchmarks/Umbraco.Tests.Benchmarks.csproj</p>
<p>Path to vulnerable library: /t/packages/microsoft.codedom.providers.dotnetcompilerplatform/2.0.1/microsoft.codedom.providers.dotnetcompilerplatform.2.0.1.nupkg,/home/wss-scanner/.nuget/packages/microsoft.codedom.providers.dotnetcompilerplatform/2.0.1/microsoft.codedom.providers.dotnetcompilerplatform.2.0.1.nupkg</p>
<p>
Dependency Hierarchy:
- :x: **microsoft.codedom.providers.dotnetcompilerplatform.2.0.1.nupkg** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/samq-democorp/Umbraco-CMS/commit/07d00f9f09c53bd7fd2cc157f7b57dbcbbc93080">07d00f9f09c53bd7fd2cc157f7b57dbcbbc93080</a></p>
<p>Found in base branch: <b>v8/dev</b></p>
</p>
<p></p>
### Vulnerability Details
<p>
Microsoft .NET Framework 2.0, 3.5, 3.5.1, 4.5.2, 4.6, 4.6.1, 4.6.2 and 4.7 allow an attacker to bypass Enhanced Security Usage taggings when they present a certificate that is invalid for a specific use, aka ".NET Security Feature Bypass Vulnerability."
<p>Publish Date: 2017-05-12
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2017-0248>CVE-2017-0248</a></p>
</p>
<p></p>
### CVSS 3 Score Details (<b>7.5</b>)
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: High
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
<p></p>
### Suggested Fix
<p>
<p>Type: Upgrade version</p>
<p>Release Date: 2017-05-12</p>
<p>Fix Resolution: System.Text.Encodings.Web - 4.0.1, 4.3.1;System.Net.Http - 4.1.2, 4.3.2;System.Net.Http.WinHttpHandler - 4.0.2, 4.3.1;System.Net.Security - 4.0.1, 4.3.1;System.Net.WebSockets.Client - 4.0.1, 4.3.1;Microsoft.AspNetCore.Mvc - 1.0.4, 1.1.3;Microsoft.AspNetCore.Mvc.Core - 1.0.4, 1.1.3;Microsoft.AspNetCore.Mvc.Abstractions - 1.0.4, 1.1.3;Microsoft.AspNetCore.Mvc.ApiExplorer - 1.0.4, 1.1.3;Microsoft.AspNetCore.Mvc.Cors - 1.0.4, 1.1.3;Microsoft.AspNetCore.Mvc.DataAnnotations - 1.0.4, 1.1.3;Microsoft.AspNetCore.Mvc.Formatters.Json - 1.0.4, 1.1.3;Microsoft.AspNetCore.Mvc.Formatters.Xml - 1.0.4, 1.1.3;Microsoft.AspNetCore.Mvc.Localization - 1.0.4, 1.1.3;Microsoft.AspNetCore.Mvc.Razor.Host - 1.0.4, 1.1.3;Microsoft.AspNetCore.Mvc.Razor - 1.0.4, 1.1.3;Microsoft.AspNetCore.Mvc.TagHelpers - 1.0.4, 1.1.3;Microsoft.AspNetCore.Mvc.ViewFeatures - 1.0.4, 1.1.3;Microsoft.AspNetCore.Mvc.WebApiCompatShim - 1.0.4, 1.1.3</p>
</p>
<p></p>
:rescue_worker_helmet: Automatic Remediation is available for this issue
</details><details>
<summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> CVE-2017-0249</summary>
### Vulnerable Libraries - <b>microsoft.codedom.providers.dotnetcompilerplatform.2.0.1.nupkg</b>, <b>microsoft.codedom.providers.dotnetcompilerplatform.2.0.1.nupkg</b></p>
<p>
### <b>microsoft.codedom.providers.dotnetcompilerplatform.2.0.1.nupkg</b></p>
<p>Replacement CodeDOM providers that use the new .NET Compiler Platform ("Roslyn") compiler as a servi...</p>
<p>Library home page: <a href="https://api.nuget.org/packages/microsoft.codedom.providers.dotnetcompilerplatform.2.0.1.nupkg">https://api.nuget.org/packages/microsoft.codedom.providers.dotnetcompilerplatform.2.0.1.nupkg</a></p>
<p>
Dependency Hierarchy:
- :x: **microsoft.codedom.providers.dotnetcompilerplatform.2.0.1.nupkg** (Vulnerable Library)
### <b>microsoft.codedom.providers.dotnetcompilerplatform.2.0.1.nupkg</b></p>
<p>Replacement CodeDOM providers that use the new .NET Compiler Platform ("Roslyn") compiler as a servi...</p>
<p>Library home page: <a href="https://api.nuget.org/packages/microsoft.codedom.providers.dotnetcompilerplatform.2.0.1.nupkg">https://api.nuget.org/packages/microsoft.codedom.providers.dotnetcompilerplatform.2.0.1.nupkg</a></p>
<p>Path to dependency file: /src/Umbraco.Tests.Benchmarks/Umbraco.Tests.Benchmarks.csproj</p>
<p>Path to vulnerable library: /t/packages/microsoft.codedom.providers.dotnetcompilerplatform/2.0.1/microsoft.codedom.providers.dotnetcompilerplatform.2.0.1.nupkg,/home/wss-scanner/.nuget/packages/microsoft.codedom.providers.dotnetcompilerplatform/2.0.1/microsoft.codedom.providers.dotnetcompilerplatform.2.0.1.nupkg</p>
<p>
Dependency Hierarchy:
- :x: **microsoft.codedom.providers.dotnetcompilerplatform.2.0.1.nupkg** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/samq-democorp/Umbraco-CMS/commit/07d00f9f09c53bd7fd2cc157f7b57dbcbbc93080">07d00f9f09c53bd7fd2cc157f7b57dbcbbc93080</a></p>
<p>Found in base branch: <b>v8/dev</b></p>
</p>
<p></p>
### Vulnerability Details
<p>
An elevation of privilege vulnerability exists when the ASP.NET Core fails to properly sanitize web requests.
<p>Publish Date: 2017-05-12
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2017-0249>CVE-2017-0249</a></p>
</p>
<p></p>
### CVSS 3 Score Details (<b>7.3</b>)
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: Low
- Integrity Impact: Low
- Availability Impact: Low
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
<p></p>
### Suggested Fix
<p>
<p>Type: Upgrade version</p>
<p>Release Date: 2017-05-12</p>
<p>Fix Resolution: System.Text.Encodings.Web - 4.0.1,4.3.1;System.Net.Http - 4.1.2,4.3.2;System.Net.Http.WinHttpHandler - 4.0.2,4.3.1;System.Net.Security - 4.0.1,4.3.1;System.Net.WebSockets.Client - 4.0.1,4.3.1;Microsoft.AspNetCore.Mvc - 1.0.4,1.1.3;Microsoft.AspNetCore.Mvc.Core - 1.0.4,1.1.3;Microsoft.AspNetCore.Mvc.Abstractions - 1.0.4,1.1.3;Microsoft.AspNetCore.Mvc.ApiExplorer - 1.0.4,1.1.3;Microsoft.AspNetCore.Mvc.Cors - 1.0.4,1.1.3;Microsoft.AspNetCore.Mvc.DataAnnotations - 1.0.4,1.1.3;Microsoft.AspNetCore.Mvc.Formatters.Json - 1.0.4,1.1.3;Microsoft.AspNetCore.Mvc.Formatters.Xml - 1.0.4,1.1.3;Microsoft.AspNetCore.Mvc.Localization - 1.0.4,1.1.3;Microsoft.AspNetCore.Mvc.Razor.Host - 1.0.4,1.1.3;Microsoft.AspNetCore.Mvc.Razor - 1.0.4,1.1.3;Microsoft.AspNetCore.Mvc.TagHelpers - 1.0.4,1.1.3;Microsoft.AspNetCore.Mvc.ViewFeatures - 1.0.4,1.1.3;Microsoft.AspNetCore.Mvc.WebApiCompatShim - 1.0.4,1.1.3</p>
</p>
<p></p>
:rescue_worker_helmet: Automatic Remediation is available for this issue
</details><details>
<summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> CVE-2017-0256</summary>
### Vulnerable Libraries - <b>microsoft.codedom.providers.dotnetcompilerplatform.2.0.1.nupkg</b>, <b>microsoft.codedom.providers.dotnetcompilerplatform.2.0.1.nupkg</b></p>
<p>
### <b>microsoft.codedom.providers.dotnetcompilerplatform.2.0.1.nupkg</b></p>
<p>Replacement CodeDOM providers that use the new .NET Compiler Platform ("Roslyn") compiler as a servi...</p>
<p>Library home page: <a href="https://api.nuget.org/packages/microsoft.codedom.providers.dotnetcompilerplatform.2.0.1.nupkg">https://api.nuget.org/packages/microsoft.codedom.providers.dotnetcompilerplatform.2.0.1.nupkg</a></p>
<p>Path to dependency file: /src/Umbraco.Tests.Benchmarks/Umbraco.Tests.Benchmarks.csproj</p>
<p>Path to vulnerable library: /t/packages/microsoft.codedom.providers.dotnetcompilerplatform/2.0.1/microsoft.codedom.providers.dotnetcompilerplatform.2.0.1.nupkg,/home/wss-scanner/.nuget/packages/microsoft.codedom.providers.dotnetcompilerplatform/2.0.1/microsoft.codedom.providers.dotnetcompilerplatform.2.0.1.nupkg</p>
<p>
Dependency Hierarchy:
- :x: **microsoft.codedom.providers.dotnetcompilerplatform.2.0.1.nupkg** (Vulnerable Library)
### <b>microsoft.codedom.providers.dotnetcompilerplatform.2.0.1.nupkg</b></p>
<p>Replacement CodeDOM providers that use the new .NET Compiler Platform ("Roslyn") compiler as a servi...</p>
<p>Library home page: <a href="https://api.nuget.org/packages/microsoft.codedom.providers.dotnetcompilerplatform.2.0.1.nupkg">https://api.nuget.org/packages/microsoft.codedom.providers.dotnetcompilerplatform.2.0.1.nupkg</a></p>
<p>
Dependency Hierarchy:
- :x: **microsoft.codedom.providers.dotnetcompilerplatform.2.0.1.nupkg** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/samq-democorp/Umbraco-CMS/commit/07d00f9f09c53bd7fd2cc157f7b57dbcbbc93080">07d00f9f09c53bd7fd2cc157f7b57dbcbbc93080</a></p>
<p>Found in base branch: <b>v8/dev</b></p>
</p>
<p></p>
### Vulnerability Details
<p>
A spoofing vulnerability exists when the ASP.NET Core fails to properly sanitize web requests.
<p>Publish Date: 2017-05-12
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2017-0256>CVE-2017-0256</a></p>
</p>
<p></p>
### CVSS 3 Score Details (<b>5.3</b>)
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: Low
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
<p></p>
### Suggested Fix
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://nvd.nist.gov/vuln/detail/CVE-2017-0256">https://nvd.nist.gov/vuln/detail/CVE-2017-0256</a></p>
<p>Release Date: 2017-05-12</p>
<p>Fix Resolution: Microsoft.AspNetCore.Mvc.ApiExplorer - 1.1.3,1.0.4;Microsoft.AspNetCore.Mvc.Abstractions - 1.1.3,1.0.4;Microsoft.AspNetCore.Mvc.Core - 1.0.4,1.1.3;Microsoft.AspNetCore.Mvc.Cors - 1.0.4,1.1.3;Microsoft.AspNetCore.Mvc.Localization - 1.1.3,1.0.4;System.Net.Http - 4.1.2,4.3.2;Microsoft.AspNetCore.Mvc.Razor - 1.1.3,1.0.4;System.Net.Http.WinHttpHandler - 4.0.2,4.3.0-preview1-24530-04;System.Net.Security - 4.3.0-preview1-24530-04,4.0.1;Microsoft.AspNetCore.Mvc.ViewFeatures - 1.1.3,1.0.4;Microsoft.AspNetCore.Mvc.TagHelpers - 1.0.4,1.1.3;System.Text.Encodings.Web - 4.3.0-preview1-24530-04,4.0.1;Microsoft.AspNetCore.Mvc.Razor.Host - 1.1.3,1.0.4;Microsoft.AspNetCore.Mvc.Formatters.Json - 1.0.4,1.1.3;Microsoft.AspNetCore.Mvc.WebApiCompatShim - 1.0.4,1.1.3;System.Net.WebSockets.Client - 4.3.0-preview1-24530-04,4.0.1;Microsoft.AspNetCore.Mvc.Formatters.Xml - 1.1.3,1.0.4;Microsoft.AspNetCore.Mvc.DataAnnotations - 1.0.4,1.1.3</p>
</p>
<p></p>
:rescue_worker_helmet: Automatic Remediation is available for this issue
</details>
***
<p>:rescue_worker_helmet: Automatic Remediation is available for this issue.</p>
|
True
|
- Integrity Impact: None
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
<p></p>
### Suggested Fix
<p>
<p>Type: Upgrade version</p>
<p>Release Date: 2018-10-10</p>
<p>Fix Resolution: System.Net.Http - 4.3.4;Microsoft.PowerShell.Commands.Utility - 6.1.0-rc.1</p>
</p>
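In practice, the upgrade can be applied by pinning the patched package version in the consuming project file so it overrides the vulnerable transitive copy at restore time. A minimal sketch — the package name and version come from the fix resolution above; the surrounding project file is hypothetical:

```xml
<!-- Illustrative .csproj fragment: pin the patched System.Net.Http named
     in the fix resolution so NuGet restores it instead of the vulnerable
     transitive version pulled in by the CodeDOM provider package. -->
<ItemGroup>
  <PackageReference Include="System.Net.Http" Version="4.3.4" />
</ItemGroup>
```

Running `dotnet restore` (or restoring in Visual Studio) after this change should resolve the patched assembly.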
<p></p>
:rescue_worker_helmet: Automatic Remediation is available for this issue
</details><details>
<summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> CVE-2017-0247</summary>
### Vulnerable Libraries - <b>microsoft.codedom.providers.dotnetcompilerplatform.2.0.1.nupkg</b>, <b>microsoft.codedom.providers.dotnetcompilerplatform.2.0.1.nupkg</b></p>
<p>
### <b>microsoft.codedom.providers.dotnetcompilerplatform.2.0.1.nupkg</b></p>
<p>Replacement CodeDOM providers that use the new .NET Compiler Platform ("Roslyn") compiler as a servi...</p>
<p>Library home page: <a href="https://api.nuget.org/packages/microsoft.codedom.providers.dotnetcompilerplatform.2.0.1.nupkg">https://api.nuget.org/packages/microsoft.codedom.providers.dotnetcompilerplatform.2.0.1.nupkg</a></p>
<p>
Dependency Hierarchy:
- :x: **microsoft.codedom.providers.dotnetcompilerplatform.2.0.1.nupkg** (Vulnerable Library)
### <b>microsoft.codedom.providers.dotnetcompilerplatform.2.0.1.nupkg</b></p>
<p>Replacement CodeDOM providers that use the new .NET Compiler Platform ("Roslyn") compiler as a servi...</p>
<p>Library home page: <a href="https://api.nuget.org/packages/microsoft.codedom.providers.dotnetcompilerplatform.2.0.1.nupkg">https://api.nuget.org/packages/microsoft.codedom.providers.dotnetcompilerplatform.2.0.1.nupkg</a></p>
<p>Path to dependency file: /src/Umbraco.Tests.Benchmarks/Umbraco.Tests.Benchmarks.csproj</p>
<p>Path to vulnerable library: /t/packages/microsoft.codedom.providers.dotnetcompilerplatform/2.0.1/microsoft.codedom.providers.dotnetcompilerplatform.2.0.1.nupkg,/home/wss-scanner/.nuget/packages/microsoft.codedom.providers.dotnetcompilerplatform/2.0.1/microsoft.codedom.providers.dotnetcompilerplatform.2.0.1.nupkg</p>
<p>
Dependency Hierarchy:
- :x: **microsoft.codedom.providers.dotnetcompilerplatform.2.0.1.nupkg** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/samq-democorp/Umbraco-CMS/commit/07d00f9f09c53bd7fd2cc157f7b57dbcbbc93080">07d00f9f09c53bd7fd2cc157f7b57dbcbbc93080</a></p>
<p>Found in base branch: <b>v8/dev</b></p>
</p>
<p></p>
### Vulnerability Details
<p>
A denial of service vulnerability exists when the ASP.NET Core fails to properly validate web requests. NOTE: Microsoft has not commented on third-party claims that the issue is that the TextEncoder.EncodeCore function in the System.Text.Encodings.Web package in ASP.NET Core Mvc before 1.0.4 and 1.1.x before 1.1.3 allows remote attackers to cause a denial of service by leveraging failure to properly calculate the length of 4-byte characters in the Unicode Non-Character range.
<p>Publish Date: 2017-05-12
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2017-0247>CVE-2017-0247</a></p>
</p>
<p></p>
### CVSS 3 Score Details (<b>7.5</b>)
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: High
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
<p></p>
### Suggested Fix
<p>
<p>Type: Upgrade version</p>
<p>Release Date: 2017-05-12</p>
<p>Fix Resolution: System.Text.Encodings.Web - 4.0.1,4.3.1;System.Net.Http - 4.1.2,4.3.2;System.Net.Http.WinHttpHandler - 4.0.2,4.5.4;System.Net.Security - 4.0.1,4.3.1;System.Net.WebSockets.Client - 4.0.1,4.3.1;Microsoft.AspNetCore.Mvc - 1.0.4,1.1.3;Microsoft.AspNetCore.Mvc.Core - 1.0.4,1.1.3;Microsoft.AspNetCore.Mvc.Abstractions - 1.0.4,1.1.3;Microsoft.AspNetCore.Mvc.ApiExplorer - 1.0.4,1.1.3;Microsoft.AspNetCore.Mvc.Cors - 1.0.4,1.1.3;Microsoft.AspNetCore.Mvc.DataAnnotations - 1.0.4,1.1.3;Microsoft.AspNetCore.Mvc.Formatters.Json - 1.0.4,1.1.3;Microsoft.AspNetCore.Mvc.Formatters.Xml - 1.0.4,1.1.3;Microsoft.AspNetCore.Mvc.Localization - 1.0.4,1.1.3;Microsoft.AspNetCore.Mvc.Razor.Host - 1.0.4,1.1.3;Microsoft.AspNetCore.Mvc.Razor - 1.0.4,1.1.3;Microsoft.AspNetCore.Mvc.TagHelpers - 1.0.4,1.1.3;Microsoft.AspNetCore.Mvc.ViewFeatures - 1.0.4,1.1.3;Microsoft.AspNetCore.Mvc.WebApiCompatShim - 1.0.4,1.1.3</p>
</p>
<p></p>
:rescue_worker_helmet: Automatic Remediation is available for this issue
</details><details>
<summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> CVE-2017-0248</summary>
### Vulnerable Libraries - <b>microsoft.codedom.providers.dotnetcompilerplatform.2.0.1.nupkg</b>, <b>microsoft.codedom.providers.dotnetcompilerplatform.2.0.1.nupkg</b></p>
<p>
### <b>microsoft.codedom.providers.dotnetcompilerplatform.2.0.1.nupkg</b></p>
<p>Replacement CodeDOM providers that use the new .NET Compiler Platform ("Roslyn") compiler as a servi...</p>
<p>Library home page: <a href="https://api.nuget.org/packages/microsoft.codedom.providers.dotnetcompilerplatform.2.0.1.nupkg">https://api.nuget.org/packages/microsoft.codedom.providers.dotnetcompilerplatform.2.0.1.nupkg</a></p>
<p>
Dependency Hierarchy:
- :x: **microsoft.codedom.providers.dotnetcompilerplatform.2.0.1.nupkg** (Vulnerable Library)
### <b>microsoft.codedom.providers.dotnetcompilerplatform.2.0.1.nupkg</b></p>
<p>Replacement CodeDOM providers that use the new .NET Compiler Platform ("Roslyn") compiler as a servi...</p>
<p>Library home page: <a href="https://api.nuget.org/packages/microsoft.codedom.providers.dotnetcompilerplatform.2.0.1.nupkg">https://api.nuget.org/packages/microsoft.codedom.providers.dotnetcompilerplatform.2.0.1.nupkg</a></p>
<p>Path to dependency file: /src/Umbraco.Tests.Benchmarks/Umbraco.Tests.Benchmarks.csproj</p>
<p>Path to vulnerable library: /t/packages/microsoft.codedom.providers.dotnetcompilerplatform/2.0.1/microsoft.codedom.providers.dotnetcompilerplatform.2.0.1.nupkg,/home/wss-scanner/.nuget/packages/microsoft.codedom.providers.dotnetcompilerplatform/2.0.1/microsoft.codedom.providers.dotnetcompilerplatform.2.0.1.nupkg</p>
<p>
Dependency Hierarchy:
- :x: **microsoft.codedom.providers.dotnetcompilerplatform.2.0.1.nupkg** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/samq-democorp/Umbraco-CMS/commit/07d00f9f09c53bd7fd2cc157f7b57dbcbbc93080">07d00f9f09c53bd7fd2cc157f7b57dbcbbc93080</a></p>
<p>Found in base branch: <b>v8/dev</b></p>
</p>
<p></p>
### Vulnerability Details
<p>
Microsoft .NET Framework 2.0, 3.5, 3.5.1, 4.5.2, 4.6, 4.6.1, 4.6.2 and 4.7 allow an attacker to bypass Enhanced Security Usage taggings when they present a certificate that is invalid for a specific use, aka ".NET Security Feature Bypass Vulnerability."
<p>Publish Date: 2017-05-12
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2017-0248>CVE-2017-0248</a></p>
</p>
<p></p>
### CVSS 3 Score Details (<b>7.5</b>)
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: High
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
<p></p>
### Suggested Fix
<p>
<p>Type: Upgrade version</p>
<p>Release Date: 2017-05-12</p>
<p>Fix Resolution: System.Text.Encodings.Web - 4.0.1, 4.3.1;System.Net.Http - 4.1.2, 4.3.2;System.Net.Http.WinHttpHandler - 4.0.2, 4.3.1;System.Net.Security - 4.0.1, 4.3.1;System.Net.WebSockets.Client - 4.0.1, 4.3.1;Microsoft.AspNetCore.Mvc - 1.0.4, 1.1.3;Microsoft.AspNetCore.Mvc.Core - 1.0.4, 1.1.3;Microsoft.AspNetCore.Mvc.Abstractions - 1.0.4, 1.1.3;Microsoft.AspNetCore.Mvc.ApiExplorer - 1.0.4, 1.1.3;Microsoft.AspNetCore.Mvc.Cors - 1.0.4, 1.1.3;Microsoft.AspNetCore.Mvc.DataAnnotations - 1.0.4, 1.1.3;Microsoft.AspNetCore.Mvc.Formatters.Json - 1.0.4, 1.1.3;Microsoft.AspNetCore.Mvc.Formatters.Xml - 1.0.4, 1.1.3;Microsoft.AspNetCore.Mvc.Localization - 1.0.4, 1.1.3;Microsoft.AspNetCore.Mvc.Razor.Host - 1.0.4, 1.1.3;Microsoft.AspNetCore.Mvc.Razor - 1.0.4, 1.1.3;Microsoft.AspNetCore.Mvc.TagHelpers - 1.0.4, 1.1.3;Microsoft.AspNetCore.Mvc.ViewFeatures - 1.0.4, 1.1.3;Microsoft.AspNetCore.Mvc.WebApiCompatShim - 1.0.4, 1.1.3</p>
</p>
<p></p>
:rescue_worker_helmet: Automatic Remediation is available for this issue
</details><details>
<summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> CVE-2017-0249</summary>
### Vulnerable Libraries - <b>microsoft.codedom.providers.dotnetcompilerplatform.2.0.1.nupkg</b>, <b>microsoft.codedom.providers.dotnetcompilerplatform.2.0.1.nupkg</b></p>
<p>
### <b>microsoft.codedom.providers.dotnetcompilerplatform.2.0.1.nupkg</b></p>
<p>Replacement CodeDOM providers that use the new .NET Compiler Platform ("Roslyn") compiler as a servi...</p>
<p>Library home page: <a href="https://api.nuget.org/packages/microsoft.codedom.providers.dotnetcompilerplatform.2.0.1.nupkg">https://api.nuget.org/packages/microsoft.codedom.providers.dotnetcompilerplatform.2.0.1.nupkg</a></p>
<p>
Dependency Hierarchy:
- :x: **microsoft.codedom.providers.dotnetcompilerplatform.2.0.1.nupkg** (Vulnerable Library)
### <b>microsoft.codedom.providers.dotnetcompilerplatform.2.0.1.nupkg</b></p>
<p>Replacement CodeDOM providers that use the new .NET Compiler Platform ("Roslyn") compiler as a servi...</p>
<p>Library home page: <a href="https://api.nuget.org/packages/microsoft.codedom.providers.dotnetcompilerplatform.2.0.1.nupkg">https://api.nuget.org/packages/microsoft.codedom.providers.dotnetcompilerplatform.2.0.1.nupkg</a></p>
<p>Path to dependency file: /src/Umbraco.Tests.Benchmarks/Umbraco.Tests.Benchmarks.csproj</p>
<p>Path to vulnerable library: /t/packages/microsoft.codedom.providers.dotnetcompilerplatform/2.0.1/microsoft.codedom.providers.dotnetcompilerplatform.2.0.1.nupkg,/home/wss-scanner/.nuget/packages/microsoft.codedom.providers.dotnetcompilerplatform/2.0.1/microsoft.codedom.providers.dotnetcompilerplatform.2.0.1.nupkg</p>
<p>
Dependency Hierarchy:
- :x: **microsoft.codedom.providers.dotnetcompilerplatform.2.0.1.nupkg** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/samq-democorp/Umbraco-CMS/commit/07d00f9f09c53bd7fd2cc157f7b57dbcbbc93080">07d00f9f09c53bd7fd2cc157f7b57dbcbbc93080</a></p>
<p>Found in base branch: <b>v8/dev</b></p>
</p>
<p></p>
### Vulnerability Details
<p>
An elevation of privilege vulnerability exists when the ASP.NET Core fails to properly sanitize web requests.
<p>Publish Date: 2017-05-12
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2017-0249>CVE-2017-0249</a></p>
</p>
<p></p>
### CVSS 3 Score Details (<b>7.3</b>)
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: Low
- Integrity Impact: Low
- Availability Impact: Low
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
<p></p>
### Suggested Fix
<p>
<p>Type: Upgrade version</p>
<p>Release Date: 2017-05-12</p>
<p>Fix Resolution: System.Text.Encodings.Web - 4.0.1,4.3.1;System.Net.Http - 4.1.2,4.3.2;System.Net.Http.WinHttpHandler - 4.0.2,4.3.1;System.Net.Security - 4.0.1,4.3.1;System.Net.WebSockets.Client - 4.0.1,4.3.1;Microsoft.AspNetCore.Mvc - 1.0.4,1.1.3;Microsoft.AspNetCore.Mvc.Core - 1.0.4,1.1.3;Microsoft.AspNetCore.Mvc.Abstractions - 1.0.4,1.1.3;Microsoft.AspNetCore.Mvc.ApiExplorer - 1.0.4,1.1.3;Microsoft.AspNetCore.Mvc.Cors - 1.0.4,1.1.3;Microsoft.AspNetCore.Mvc.DataAnnotations - 1.0.4,1.1.3;Microsoft.AspNetCore.Mvc.Formatters.Json - 1.0.4,1.1.3;Microsoft.AspNetCore.Mvc.Formatters.Xml - 1.0.4,1.1.3;Microsoft.AspNetCore.Mvc.Localization - 1.0.4,1.1.3;Microsoft.AspNetCore.Mvc.Razor.Host - 1.0.4,1.1.3;Microsoft.AspNetCore.Mvc.Razor - 1.0.4,1.1.3;Microsoft.AspNetCore.Mvc.TagHelpers - 1.0.4,1.1.3;Microsoft.AspNetCore.Mvc.ViewFeatures - 1.0.4,1.1.3;Microsoft.AspNetCore.Mvc.WebApiCompatShim - 1.0.4,1.1.3</p>
</p>
<p></p>
:rescue_worker_helmet: Automatic Remediation is available for this issue
</details><details>
<summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> CVE-2017-0256</summary>
### Vulnerable Libraries - <b>microsoft.codedom.providers.dotnetcompilerplatform.2.0.1.nupkg</b>, <b>microsoft.codedom.providers.dotnetcompilerplatform.2.0.1.nupkg</b></p>
<p>
### <b>microsoft.codedom.providers.dotnetcompilerplatform.2.0.1.nupkg</b></p>
<p>Replacement CodeDOM providers that use the new .NET Compiler Platform ("Roslyn") compiler as a servi...</p>
<p>Library home page: <a href="https://api.nuget.org/packages/microsoft.codedom.providers.dotnetcompilerplatform.2.0.1.nupkg">https://api.nuget.org/packages/microsoft.codedom.providers.dotnetcompilerplatform.2.0.1.nupkg</a></p>
<p>Path to dependency file: /src/Umbraco.Tests.Benchmarks/Umbraco.Tests.Benchmarks.csproj</p>
<p>Path to vulnerable library: /t/packages/microsoft.codedom.providers.dotnetcompilerplatform/2.0.1/microsoft.codedom.providers.dotnetcompilerplatform.2.0.1.nupkg,/home/wss-scanner/.nuget/packages/microsoft.codedom.providers.dotnetcompilerplatform/2.0.1/microsoft.codedom.providers.dotnetcompilerplatform.2.0.1.nupkg</p>
<p>
Dependency Hierarchy:
- :x: **microsoft.codedom.providers.dotnetcompilerplatform.2.0.1.nupkg** (Vulnerable Library)
### <b>microsoft.codedom.providers.dotnetcompilerplatform.2.0.1.nupkg</b></p>
<p>Replacement CodeDOM providers that use the new .NET Compiler Platform ("Roslyn") compiler as a servi...</p>
<p>Library home page: <a href="https://api.nuget.org/packages/microsoft.codedom.providers.dotnetcompilerplatform.2.0.1.nupkg">https://api.nuget.org/packages/microsoft.codedom.providers.dotnetcompilerplatform.2.0.1.nupkg</a></p>
<p>
Dependency Hierarchy:
- :x: **microsoft.codedom.providers.dotnetcompilerplatform.2.0.1.nupkg** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/samq-democorp/Umbraco-CMS/commit/07d00f9f09c53bd7fd2cc157f7b57dbcbbc93080">07d00f9f09c53bd7fd2cc157f7b57dbcbbc93080</a></p>
<p>Found in base branch: <b>v8/dev</b></p>
</p>
<p></p>
### Vulnerability Details
<p>
A spoofing vulnerability exists when the ASP.NET Core fails to properly sanitize web requests.
<p>Publish Date: 2017-05-12
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2017-0256>CVE-2017-0256</a></p>
</p>
<p></p>
### CVSS 3 Score Details (<b>5.3</b>)
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: Low
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
<p></p>
### Suggested Fix
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://nvd.nist.gov/vuln/detail/CVE-2017-0256">https://nvd.nist.gov/vuln/detail/CVE-2017-0256</a></p>
<p>Release Date: 2017-05-12</p>
<p>Fix Resolution: Microsoft.AspNetCore.Mvc.ApiExplorer - 1.1.3,1.0.4;Microsoft.AspNetCore.Mvc.Abstractions - 1.1.3,1.0.4;Microsoft.AspNetCore.Mvc.Core - 1.0.4,1.1.3;Microsoft.AspNetCore.Mvc.Cors - 1.0.4,1.1.3;Microsoft.AspNetCore.Mvc.Localization - 1.1.3,1.0.4;System.Net.Http - 4.1.2,4.3.2;Microsoft.AspNetCore.Mvc.Razor - 1.1.3,1.0.4;System.Net.Http.WinHttpHandler - 4.0.2,4.3.0-preview1-24530-04;System.Net.Security - 4.3.0-preview1-24530-04,4.0.1;Microsoft.AspNetCore.Mvc.ViewFeatures - 1.1.3,1.0.4;Microsoft.AspNetCore.Mvc.TagHelpers - 1.0.4,1.1.3;System.Text.Encodings.Web - 4.3.0-preview1-24530-04,4.0.1;Microsoft.AspNetCore.Mvc.Razor.Host - 1.1.3,1.0.4;Microsoft.AspNetCore.Mvc.Formatters.Json - 1.0.4,1.1.3;Microsoft.AspNetCore.Mvc.WebApiCompatShim - 1.0.4,1.1.3;System.Net.WebSockets.Client - 4.3.0-preview1-24530-04,4.0.1;Microsoft.AspNetCore.Mvc.Formatters.Xml - 1.1.3,1.0.4;Microsoft.AspNetCore.Mvc.DataAnnotations - 1.0.4,1.1.3</p>
</p>
<p></p>
:rescue_worker_helmet: Automatic Remediation is available for this issue
</details>
***
<p>:rescue_worker_helmet: Automatic Remediation is available for this issue.</p>
|
non_process
|
| 0
|
376,345
| 26,196,387,523
|
IssuesEvent
|
2023-01-03 13:50:45
|
weaveworks/weave-gitops
|
https://api.github.com/repos/weaveworks/weave-gitops
|
closed
|
TF controller install docs should default to GitOps with Flux
|
documentation team/denim
|
https://docs.gitops.weave.works/docs/installation/weave-gitops-enterprise/#optional-install-the-tf-controller
Suggests manually adding through Helm CLI
https://docs.gitops.weave.works/docs/terraform/get-started/ does talk about installing via flux, but again by directly applying the flux manifests to the cluster via `kubectl apply -f https://raw.githubusercontent.com/weaveworks/tf-controller/main/docs/release.yaml`
We should likely mirror more of a GitOps based deployment similar to WGE itself https://docs.gitops.weave.works/docs/installation/weave-gitops-enterprise/#5-configure-and-commit
|
1.0
|
TF controller install docs should default to GitOps with Flux - https://docs.gitops.weave.works/docs/installation/weave-gitops-enterprise/#optional-install-the-tf-controller
Suggests manually adding through Helm CLI
https://docs.gitops.weave.works/docs/terraform/get-started/ does talk about installing via flux, but again by directly applying the flux manifests to the cluster via `kubectl apply -f https://raw.githubusercontent.com/weaveworks/tf-controller/main/docs/release.yaml`
We should likely mirror more of a GitOps based deployment similar to WGE itself https://docs.gitops.weave.works/docs/installation/weave-gitops-enterprise/#5-configure-and-commit
|
non_process
|
tf controller install docs should default to gitops with flux suggests manually adding through helm cli does talk about installing via flux but again by directly applying the flux manifests to the cluster via kubectl apply f we should likely mirror more of a gitops based deployment similar to wge itself
| 0
|
86,102
| 3,702,670,695
|
IssuesEvent
|
2016-02-29 17:37:13
|
dmusican/Elegit
|
https://api.github.com/repos/dmusican/Elegit
|
closed
|
Label HEAD and branches in commit graphs
|
enhancement priority high
|
It would be really helpful to have refs (branches, commits, tags, HEAD, etc) in the actual graph. The challenge is to keep this from getting too noisy, but I think some way of doing this is pretty important. I just did a fast forward merge, and the graph didn't change. Seeing some update would be really helpful.
Specifics: the refs could be:
- in the nodes themselves: the nodes would have to get bigger
- near the nodes: they'd have to point. This would get messy, but could work
- in a row at the top or bottom of the graph. It's already the case that only a single commit appears in any given column, so this could perhaps be easiest?
|
1.0
|
Label HEAD and branches in commit graphs - It would be really helpful to have refs (branches, commits, tags, HEAD, etc) in the actual graph. The challenge is to keep this from getting too noisy, but I think some way of doing this is pretty important. I just did a fast forward merge, and the graph didn't change. Seeing some update would be really helpful.
Specifics: the refs could be:
- in the nodes themselves: the nodes would have to get bigger
- near the nodes: they'd have to point. This would get messy, but could work
- in a row at the top or bottom of the graph. It's already the case that only a single commit appears in any given column, so this could perhaps be easiest?
|
non_process
|
label head and branches in commit graphs it would be really helpful to have refs branches commits tags head etc in the actual graph the challenge is to keep this from getting too noisy but i think some way of doing this is pretty important i just did a fast forward merge and the graph didn t change seeing some update would be really helpful specifics the refs could be in the nodes themselves the nodes would have to get bigger near the nodes they d have to point this would get messy but could work in a row at the top or bottom of the graph it s already the case that only a single commit appears in any given column so this could perhaps be easiest
| 0
|
1,225
| 3,756,985,105
|
IssuesEvent
|
2016-03-13 18:18:23
|
pwittchen/gesture
|
https://api.github.com/repos/pwittchen/gesture
|
closed
|
prepare library for release
|
release process
|
update `gradle.properties` file, set initial library version and add upload script for Maven Central repo.
|
1.0
|
prepare library for release - update `gradle.properties` file, set initial library version and add upload script for Maven Central repo.
|
process
|
prepare library for release update gradle properties file set initial library version and add upload script for maven central repo
| 1
|
3,597
| 6,626,244,953
|
IssuesEvent
|
2017-09-22 18:43:02
|
Azure/azure-event-hubs-java
|
https://api.github.com/repos/Azure/azure-event-hubs-java
|
closed
|
Move logging framework from logger to slf4j
|
enhancement EventProcessorHost
|
_From @SreeramGarlapati on May 16, 2016 17:1_
One open Item here is - if the log4j appender has to push events to EventHubs using the JavaClient - how to ignore the eventhub events (am sure there should be a way)
_Copied from original issue: Azure/azure-event-hubs#148_
|
1.0
|
Move logging framework from logger to slf4j - _From @SreeramGarlapati on May 16, 2016 17:1_
One open Item here is - if the log4j appender has to push events to EventHubs using the JavaClient - how to ignore the eventhub events (am sure there should be a way)
_Copied from original issue: Azure/azure-event-hubs#148_
|
process
|
move logging framework from logger to from sreeramgarlapati on may one open item here is if the appender has to push events to eventhubs using the javaclient how to ignore the eventhub events am sure there should be a way copied from original issue azure azure event hubs
| 1
|
40,832
| 12,799,745,706
|
IssuesEvent
|
2020-07-02 15:52:39
|
mwilliams7197/react-coinbase-commerce
|
https://api.github.com/repos/mwilliams7197/react-coinbase-commerce
|
opened
|
CVE-2018-11697 (High) detected in multiple libraries
|
security vulnerability
|
## CVE-2018-11697 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Libraries - <b>node-sass-4.9.2.tgz</b></p></summary>
<p>
<details><summary><b>node-sass-4.9.2.tgz</b></p></summary>
<p>Wrapper around libsass</p>
<p>Library home page: <a href="https://registry.npmjs.org/node-sass/-/node-sass-4.9.2.tgz">https://registry.npmjs.org/node-sass/-/node-sass-4.9.2.tgz</a></p>
<p>Path to dependency file: /tmp/ws-scm/react-coinbase-commerce/package.json</p>
<p>Path to vulnerable library: /react-coinbase-commerce/node_modules/node-sass/package.json</p>
<p>
Dependency Hierarchy:
- :x: **node-sass-4.9.2.tgz** (Vulnerable Library)
</details>
<p>Found in HEAD commit: <a href="https://github.com/mwilliams7197/react-coinbase-commerce/commit/226ccc0cc2cdcf2c9b55d33b0f57d28b46d6103d">226ccc0cc2cdcf2c9b55d33b0f57d28b46d6103d</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
An issue was discovered in LibSass through 3.5.4. An out-of-bounds read of a memory region was found in the function Sass::Prelexer::exactly() which could be leveraged by an attacker to disclose information or manipulated to read from unmapped memory causing a denial of service.
<p>Publish Date: 2018-06-04
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2018-11697>CVE-2018-11697</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>8.1</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: Required
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: None
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2018-11697">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2018-11697</a></p>
<p>Release Date: 2019-09-01</p>
<p>Fix Resolution: LibSass - 3.6.0</p>
</p>
</details>
<p></p>
***
<!-- REMEDIATE-OPEN-PR-START -->
- [ ] Check this box to open an automated fix PR
<!-- REMEDIATE-OPEN-PR-END -->
<!-- <REMEDIATE>{"isOpenPROnVulnerability":false,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"javascript/Node.js","packageName":"node-sass","packageVersion":"4.9.2","isTransitiveDependency":false,"dependencyTree":"node-sass:4.9.2","isMinimumFixVersionAvailable":true,"minimumFixVersion":"LibSass - 3.6.0"}],"vulnerabilityIdentifier":"CVE-2018-11697","vulnerabilityDetails":"An issue was discovered in LibSass through 3.5.4. An out-of-bounds read of a memory region was found in the function Sass::Prelexer::exactly() which could be leveraged by an attacker to disclose information or manipulated to read from unmapped memory causing a denial of service.","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2018-11697","cvss3Severity":"high","cvss3Score":"8.1","cvss3Metrics":{"A":"High","AC":"Low","PR":"None","S":"Unchanged","C":"High","UI":"Required","AV":"Network","I":"None"},"extraData":{}}</REMEDIATE> -->
|
True
|
CVE-2018-11697 (High) detected in multiple libraries - ## CVE-2018-11697 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Libraries - <b>node-sass-4.9.2.tgz</b></p></summary>
<p>
<details><summary><b>node-sass-4.9.2.tgz</b></p></summary>
<p>Wrapper around libsass</p>
<p>Library home page: <a href="https://registry.npmjs.org/node-sass/-/node-sass-4.9.2.tgz">https://registry.npmjs.org/node-sass/-/node-sass-4.9.2.tgz</a></p>
<p>Path to dependency file: /tmp/ws-scm/react-coinbase-commerce/package.json</p>
<p>Path to vulnerable library: /react-coinbase-commerce/node_modules/node-sass/package.json</p>
<p>
Dependency Hierarchy:
- :x: **node-sass-4.9.2.tgz** (Vulnerable Library)
</details>
<p>Found in HEAD commit: <a href="https://github.com/mwilliams7197/react-coinbase-commerce/commit/226ccc0cc2cdcf2c9b55d33b0f57d28b46d6103d">226ccc0cc2cdcf2c9b55d33b0f57d28b46d6103d</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
An issue was discovered in LibSass through 3.5.4. An out-of-bounds read of a memory region was found in the function Sass::Prelexer::exactly() which could be leveraged by an attacker to disclose information or manipulated to read from unmapped memory causing a denial of service.
<p>Publish Date: 2018-06-04
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2018-11697>CVE-2018-11697</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>8.1</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: Required
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: None
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2018-11697">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2018-11697</a></p>
<p>Release Date: 2019-09-01</p>
<p>Fix Resolution: LibSass - 3.6.0</p>
</p>
</details>
<p></p>
***
<!-- REMEDIATE-OPEN-PR-START -->
- [ ] Check this box to open an automated fix PR
<!-- REMEDIATE-OPEN-PR-END -->
<!-- <REMEDIATE>{"isOpenPROnVulnerability":false,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"javascript/Node.js","packageName":"node-sass","packageVersion":"4.9.2","isTransitiveDependency":false,"dependencyTree":"node-sass:4.9.2","isMinimumFixVersionAvailable":true,"minimumFixVersion":"LibSass - 3.6.0"}],"vulnerabilityIdentifier":"CVE-2018-11697","vulnerabilityDetails":"An issue was discovered in LibSass through 3.5.4. An out-of-bounds read of a memory region was found in the function Sass::Prelexer::exactly() which could be leveraged by an attacker to disclose information or manipulated to read from unmapped memory causing a denial of service.","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2018-11697","cvss3Severity":"high","cvss3Score":"8.1","cvss3Metrics":{"A":"High","AC":"Low","PR":"None","S":"Unchanged","C":"High","UI":"Required","AV":"Network","I":"None"},"extraData":{}}</REMEDIATE> -->
|
non_process
|
cve high detected in multiple libraries cve high severity vulnerability vulnerable libraries node sass tgz node sass tgz wrapper around libsass library home page a href path to dependency file tmp ws scm react coinbase commerce package json path to vulnerable library react coinbase commerce node modules node sass package json dependency hierarchy x node sass tgz vulnerable library found in head commit a href vulnerability details an issue was discovered in libsass through an out of bounds read of a memory region was found in the function sass prelexer exactly which could be leveraged by an attacker to disclose information or manipulated to read from unmapped memory causing a denial of service publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction required scope unchanged impact metrics confidentiality impact high integrity impact none availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution libsass check this box to open an automated fix pr isopenpronvulnerability false ispackagebased true isdefaultbranch true packages vulnerabilityidentifier cve vulnerabilitydetails an issue was discovered in libsass through an out of bounds read of a memory region was found in the function sass prelexer exactly which could be leveraged by an attacker to disclose information or manipulated to read from unmapped memory causing a denial of service vulnerabilityurl
| 0
|
14,640
| 17,771,043,981
|
IssuesEvent
|
2021-08-30 13:42:20
|
googleapis/python-dlp
|
https://api.github.com/repos/googleapis/python-dlp
|
reopened
|
Dependency Dashboard
|
type: process api: dlp
|
This issue provides visibility into Renovate updates and their statuses. [Learn more](https://docs.renovatebot.com/key-concepts/dashboard/)
## Edited/Blocked
These updates have been manually edited so Renovate will no longer make changes. To discard all commits and start over, click on a checkbox.
- [ ] <!-- rebase-branch=renovate/google-cloud-pubsub-2.x -->[chore(deps): update dependency google-cloud-pubsub to v2.7.1](../pull/235)
- [ ] <!-- rebase-branch=renovate/google-cloud-bigquery-2.x -->[chore(deps): update dependency google-cloud-bigquery to v2.25.1](../pull/234)
---
- [ ] <!-- manual job -->Check this box to trigger a request for Renovate to run again on this repository
|
1.0
|
Dependency Dashboard - This issue provides visibility into Renovate updates and their statuses. [Learn more](https://docs.renovatebot.com/key-concepts/dashboard/)
## Edited/Blocked
These updates have been manually edited so Renovate will no longer make changes. To discard all commits and start over, click on a checkbox.
- [ ] <!-- rebase-branch=renovate/google-cloud-pubsub-2.x -->[chore(deps): update dependency google-cloud-pubsub to v2.7.1](../pull/235)
- [ ] <!-- rebase-branch=renovate/google-cloud-bigquery-2.x -->[chore(deps): update dependency google-cloud-bigquery to v2.25.1](../pull/234)
---
- [ ] <!-- manual job -->Check this box to trigger a request for Renovate to run again on this repository
|
process
|
dependency dashboard this issue provides visibility into renovate updates and their statuses edited blocked these updates have been manually edited so renovate will no longer make changes to discard all commits and start over click on a checkbox pull pull check this box to trigger a request for renovate to run again on this repository
| 1
|
5,504
| 8,375,048,392
|
IssuesEvent
|
2018-10-05 15:18:12
|
googleapis/google-cloud-python-happybase
|
https://api.github.com/repos/googleapis/google-cloud-python-happybase
|
closed
|
Update pypi release for `pip install google-cloud-happybase`
|
type: process
|
The release present in pypi (https://pypi.org/project/google-cloud-happybase/) is severely outdated. Please make a release reflecting the changes in the bigtable api.
|
1.0
|
Update pypi release for `pip install google-cloud-happybase` - The release present in pypi (https://pypi.org/project/google-cloud-happybase/) is severely outdated. Please make a release reflecting the changes in the bigtable api.
|
process
|
update pypi release for pip install google cloud happybase the release present in pypi is severely outdated please make a release reflecting the changes in the bigtable api
| 1
|
202,280
| 23,076,940,878
|
IssuesEvent
|
2022-07-26 01:08:26
|
Rossb0b/Swapi
|
https://api.github.com/repos/Rossb0b/Swapi
|
opened
|
CVE-2021-35065 (High) detected in glob-parent-3.1.0.tgz, glob-parent-2.0.0.tgz
|
security vulnerability
|
## CVE-2021-35065 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Libraries - <b>glob-parent-3.1.0.tgz</b>, <b>glob-parent-2.0.0.tgz</b></p></summary>
<p>
<details><summary><b>glob-parent-3.1.0.tgz</b></p></summary>
<p>Strips glob magic from a string to provide the parent directory path</p>
<p>Library home page: <a href="https://registry.npmjs.org/glob-parent/-/glob-parent-3.1.0.tgz">https://registry.npmjs.org/glob-parent/-/glob-parent-3.1.0.tgz</a></p>
<p>Path to dependency file: /Swapi/package.json</p>
<p>Path to vulnerable library: /node_modules/glob-parent/package.json</p>
<p>
Dependency Hierarchy:
- karma-3.0.0.tgz (Root Library)
- chokidar-2.0.4.tgz
- :x: **glob-parent-3.1.0.tgz** (Vulnerable Library)
</details>
<details><summary><b>glob-parent-2.0.0.tgz</b></p></summary>
<p>Strips glob magic from a string to provide the parent path</p>
<p>Library home page: <a href="https://registry.npmjs.org/glob-parent/-/glob-parent-2.0.0.tgz">https://registry.npmjs.org/glob-parent/-/glob-parent-2.0.0.tgz</a></p>
<p>Path to dependency file: /Swapi/package.json</p>
<p>Path to vulnerable library: /node_modules/@angular/compiler-cli/node_modules/glob-parent/package.json</p>
<p>
Dependency Hierarchy:
- compiler-cli-7.0.4.tgz (Root Library)
- chokidar-1.7.0.tgz
- :x: **glob-parent-2.0.0.tgz** (Vulnerable Library)
</details>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
The package glob-parent before 6.0.1 are vulnerable to Regular Expression Denial of Service (ReDoS)
<p>Publish Date: 2021-06-22
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-35065>CVE-2021-35065</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.5</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://github.com/advisories/GHSA-cj88-88mr-972w">https://github.com/advisories/GHSA-cj88-88mr-972w</a></p>
<p>Release Date: 2021-06-22</p>
<p>Fix Resolution: glob-parent - 6.0.1</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with Mend [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
|
True
|
CVE-2021-35065 (High) detected in glob-parent-3.1.0.tgz, glob-parent-2.0.0.tgz - ## CVE-2021-35065 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Libraries - <b>glob-parent-3.1.0.tgz</b>, <b>glob-parent-2.0.0.tgz</b></p></summary>
<p>
<details><summary><b>glob-parent-3.1.0.tgz</b></p></summary>
<p>Strips glob magic from a string to provide the parent directory path</p>
<p>Library home page: <a href="https://registry.npmjs.org/glob-parent/-/glob-parent-3.1.0.tgz">https://registry.npmjs.org/glob-parent/-/glob-parent-3.1.0.tgz</a></p>
<p>Path to dependency file: /Swapi/package.json</p>
<p>Path to vulnerable library: /node_modules/glob-parent/package.json</p>
<p>
Dependency Hierarchy:
- karma-3.0.0.tgz (Root Library)
- chokidar-2.0.4.tgz
- :x: **glob-parent-3.1.0.tgz** (Vulnerable Library)
</details>
<details><summary><b>glob-parent-2.0.0.tgz</b></p></summary>
<p>Strips glob magic from a string to provide the parent path</p>
<p>Library home page: <a href="https://registry.npmjs.org/glob-parent/-/glob-parent-2.0.0.tgz">https://registry.npmjs.org/glob-parent/-/glob-parent-2.0.0.tgz</a></p>
<p>Path to dependency file: /Swapi/package.json</p>
<p>Path to vulnerable library: /node_modules/@angular/compiler-cli/node_modules/glob-parent/package.json</p>
<p>
Dependency Hierarchy:
- compiler-cli-7.0.4.tgz (Root Library)
- chokidar-1.7.0.tgz
- :x: **glob-parent-2.0.0.tgz** (Vulnerable Library)
</details>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
The package glob-parent before 6.0.1 are vulnerable to Regular Expression Denial of Service (ReDoS)
<p>Publish Date: 2021-06-22
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-35065>CVE-2021-35065</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.5</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://github.com/advisories/GHSA-cj88-88mr-972w">https://github.com/advisories/GHSA-cj88-88mr-972w</a></p>
<p>Release Date: 2021-06-22</p>
<p>Fix Resolution: glob-parent - 6.0.1</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with Mend [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
|
non_process
|
cve high detected in glob parent tgz glob parent tgz cve high severity vulnerability vulnerable libraries glob parent tgz glob parent tgz glob parent tgz strips glob magic from a string to provide the parent directory path library home page a href path to dependency file swapi package json path to vulnerable library node modules glob parent package json dependency hierarchy karma tgz root library chokidar tgz x glob parent tgz vulnerable library glob parent tgz strips glob magic from a string to provide the parent path library home page a href path to dependency file swapi package json path to vulnerable library node modules angular compiler cli node modules glob parent package json dependency hierarchy compiler cli tgz root library chokidar tgz x glob parent tgz vulnerable library vulnerability details the package glob parent before are vulnerable to regular expression denial of service redos publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction none scope unchanged impact metrics confidentiality impact none integrity impact none availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution glob parent step up your open source security game with mend
| 0
|
731,549
| 25,222,032,627
|
IssuesEvent
|
2022-11-14 13:30:27
|
SAP/guided-answers-extension
|
https://api.github.com/repos/SAP/guided-answers-extension
|
closed
|
BUG - UX - Hover on the last node is wonked
|
type:bug guided-answers-extension bug-priority:medium
|
There's something wonky happening with hover on last node.
Expected result is that literally nothing happens. No hover effects, no mouse effects, nothing. It's just a plain old boring node. But what happens is that border changes, and there's a brief 1-frame, blitz of mouse cursor going to `pointer`. Can we please fix this?

@ritarora
|
1.0
|
BUG - UX - Hover on the last node is wonked - There's something wonky happening with hover on last node.
Expected result is that literally nothing happens. No hover effects, no mouse effects, nothing. It's just a plain old boring node. But what happens is that border changes, and there's a brief 1-frame, blitz of mouse cursor going to `pointer`. Can we please fix this?

@ritarora
|
non_process
|
bug ux hover on the last node is wonked there s something wonky happening with hover on last node expected result is that literally nothing happens no hover effects no mouse effects nothing it s just a plain old boring node but what happens is that border changes and there s a brief frame blitz of mouse cursor going to pointer can we please fix this ritarora
| 0
|
3,667
| 6,694,825,224
|
IssuesEvent
|
2017-10-10 04:47:03
|
york-region-tpss/stp
|
https://api.github.com/repos/york-region-tpss/stp
|
opened
|
Contractor Upload - Upload Data Validation
|
enhancement process workflow
|
Add validation based on a unique id for each tree watering assignment.
|
1.0
|
Contractor Upload - Upload Data Validation - Add validation based on a unique id for each tree watering assignment.
|
process
|
contractor upload upload data validation add validation based on a unique id for each tree watering assignment
| 1
|
470,207
| 13,534,394,888
|
IssuesEvent
|
2020-09-16 05:35:42
|
moibit/tracy-mobile-app
|
https://api.github.com/repos/moibit/tracy-mobile-app
|
closed
|
Login page has serious performance issue
|
PRIORITY-1 bug
|
When you try to login to app, it is not logging after several minutes. Strangely if you close the app and open it goes to landing page within few seconds!
|
1.0
|
Login page has serious performance issue - When you try to login to app, it is not logging after several minutes. Strangely if you close the app and open it goes to landing page within few seconds!
|
non_process
|
login page has serious performance issue when you try to login to app it is not logging after several minutes strangely if you close the app and open it goes to landing page within few seconds
| 0
|
10,979
| 13,782,315,422
|
IssuesEvent
|
2020-10-08 17:27:32
|
googleapis/python-datastore
|
https://api.github.com/repos/googleapis/python-datastore
|
closed
|
google.api_core.exceptions.ResourceExhausted: 429 Received message larger than max (11915495 vs. 4194304)
|
api: datastore external testing type: process
|
- Facing the error when try to run the `system tests` against the emulator.
- System tests are related to `LargeCharacterEntity`
- Related issue in pubsub https://github.com/googleapis/python-pubsub/issues/3
|
1.0
|
google.api_core.exceptions.ResourceExhausted: 429 Received message larger than max (11915495 vs. 4194304) - - Facing the error when try to run the `system tests` against the emulator.
- System tests are related to `LargeCharacterEntity`
- Related issue in pubsub https://github.com/googleapis/python-pubsub/issues/3
|
process
|
google api core exceptions resourceexhausted received message larger than max vs facing the error when try to run the system tests against the emulator system tests are related to largecharacterentity related issue in pubsub
| 1
|
5,342
| 8,170,256,079
|
IssuesEvent
|
2018-08-27 06:57:21
|
arxiv-vanity/engrafo
|
https://api.github.com/repos/arxiv-vanity/engrafo
|
closed
|
Put acknowledgements in Distill's appendix
|
area/postprocessor priority/low type/enhancement
|
If there is a heading called "acknowledgements" at the end, put it in <dt-appendix>. This miight be complicated if there is any block-level stuff in there because Distill doesn't apply formatting to its appendix. So, in that case, just skip it.
|
1.0
|
Put acknowledgements in Distill's appendix - If there is a heading called "acknowledgements" at the end, put it in <dt-appendix>. This might be complicated if there is any block-level stuff in there because Distill doesn't apply formatting to its appendix. So, in that case, just skip it.
|
process
|
put acknowledgements in distill s appendix if there is a heading called acknowledgements at the end put it in this might be complicated if there is any block level stuff in there because distill doesn t apply formatting to its appendix so in that case just skip it
| 1
|
21,603
| 30,005,478,440
|
IssuesEvent
|
2023-06-26 12:10:48
|
open-telemetry/opentelemetry-collector-contrib
|
https://api.github.com/repos/open-telemetry/opentelemetry-collector-contrib
|
closed
|
[processor/k8sattributesprocessor] Add resource_attributes group in metadata.yaml.
|
processor/k8sattributes cmd/mdatagen
|
### Component(s)
cmd/mdatagen, processor/k8sattributes
### Describe the issue you're reporting
The group resource_attributes was introduced in the following [PR](https://github.com/open-telemetry/opentelemetry-collector-contrib/pull/21664).
Seeing that the `k8sattributesprocessor` already supports enabling/disabling the resource via a different configuration interface, the work here is adding the attributes to the `metadata.yaml` and documenting that these are exposed through a different configuration interface (as discussed [here](https://github.com/open-telemetry/opentelemetry-collector-contrib/issues/22997)).
|
1.0
|
[processor/k8sattributesprocessor] Add resource_attributes group in metadata.yaml. - ### Component(s)
cmd/mdatagen, processor/k8sattributes
### Describe the issue you're reporting
The group resource_attributes was introduced in the following [PR](https://github.com/open-telemetry/opentelemetry-collector-contrib/pull/21664).
Seeing that the `k8sattributesprocessor` already supports enabling/disabling the resource via a different configuration interface, the work here is adding the attributes to the `metadata.yaml` and documenting that these are exposed through a different configuration interface (as discussed [here](https://github.com/open-telemetry/opentelemetry-collector-contrib/issues/22997)).
|
process
|
add resource attributes group in metadata yaml component s cmd mdatagen processor describe the issue you re reporting the group resource attributes was introduced in the following seeing that the already supports enabling disabling the resource via a different configuration interface the work here is adding the attributes to the metadata yaml and documenting that these are exposed through a different configuration interface as discussed
| 1
|
217,187
| 24,320,413,042
|
IssuesEvent
|
2022-09-30 10:12:14
|
Nordix/Meridio
|
https://api.github.com/repos/Nordix/Meridio
|
closed
|
BGP md5 authentication
|
kind/enhancement concept/gateway area/security area/networking component/front-end
|
-per peer authentication
-support update of pre-sharedkey
BIRD:
> password string
>
> Use this password for MD5 authentication of BGP sessions ([RFC 2385](http://www.rfc-editor.org/info/rfc2385)). When used on BSD systems, see also setkey option below. Default: no authentication.
|
True
|
BGP md5 authentication - -per peer authentication
-support update of pre-sharedkey
BIRD:
> password string
>
> Use this password for MD5 authentication of BGP sessions ([RFC 2385](http://www.rfc-editor.org/info/rfc2385)). When used on BSD systems, see also setkey option below. Default: no authentication.
|
non_process
|
bgp authentication per peer authentication support update of pre sharedkey bird password string use this password for authentication of bgp sessions when used on bsd systems see also setkey option below default no authentication
| 0
|
18,198
| 10,025,231,455
|
IssuesEvent
|
2019-07-17 01:16:54
|
tensorflow/tensorflow
|
https://api.github.com/repos/tensorflow/tensorflow
|
closed
|
Error then try to feed sparse data to tensorflow dataset api
|
comp:data contrib stat:awaiting tensorflower type:bug/performance
|
Tensorflow 1.13.1
I have a sparse numpy array and trying to feed it to dataset api and dynamically pad in every batch.
```
import tensorflow as tf
import numpy as np
X = np.array([np.ones((np.random.randint(5, 10), 1)) for i in range(10)])
y = np.ones(10).reshape(-1, 1)
data = tf.data.Dataset.from_tensor_slices((X, y))
data = data.apply(tf.contrib.data.shuffle_and_repeat(buffer_size=2))
data = data.padded_batch(10, padded_shapes=([None, 1], []))
iterator = tf.data.Iterator.from_structure(data.output_types, data.output_shapes)
batch = iterator.get_next()
init_op = iterator.make_initializer(data)
with tf.Session() as sess:
sess.run(init_op)
batch_out = sess.run(batch)
print(batch_out)
```
But get error
```
Expected binary or unicode string, got array([[1.],
[1.],
[1.],
[1.],
[1.],
[1.],
[1.]])
```
|
True
|
Error then try to feed sparse data to tensorflow dataset api - Tensorflow 1.13.1
I have a sparse numpy array and trying to feed it to dataset api and dynamically pad in every batch.
```
import tensorflow as tf
import numpy as np
X = np.array([np.ones((np.random.randint(5, 10), 1)) for i in range(10)])
y = np.ones(10).reshape(-1, 1)
data = tf.data.Dataset.from_tensor_slices((X, y))
data = data.apply(tf.contrib.data.shuffle_and_repeat(buffer_size=2))
data = data.padded_batch(10, padded_shapes=([None, 1], []))
iterator = tf.data.Iterator.from_structure(data.output_types, data.output_shapes)
batch = iterator.get_next()
init_op = iterator.make_initializer(data)
with tf.Session() as sess:
sess.run(init_op)
batch_out = sess.run(batch)
print(batch_out)
```
But get error
```
Expected binary or unicode string, got array([[1.],
[1.],
[1.],
[1.],
[1.],
[1.],
[1.]])
```
|
non_process
|
error then try to feed sparse data to tensorflow dataset api tensorflow i have a sparse numpy array and trying to feed it to dataset api and dynamically pad in every batch import tensorflow as tf import numpy as np x np array y np ones reshape data tf data dataset from tensor slices x y data data apply tf contrib data shuffle and repeat buffer size data data padded batch padded shapes iterator tf data iterator from structure data output types data output shapes batch iterator get next init op iterator make initializer data with tf session as sess sess run init op batch out sess run batch print batch out but get error expected binary or unicode string got array
| 0
|
10,293
| 13,145,443,162
|
IssuesEvent
|
2020-08-08 03:30:50
|
elastic/beats
|
https://api.github.com/repos/elastic/beats
|
closed
|
Proposal docker prospector with processor and default behaviour
|
:Processors Filebeat Stalled enhancement needs_team
|
Currently an example config for the docker prospector looks as following:
```
- type: docker
paths:
- /var/lib/docker/containers/*/*.log
processors:
- add_docker_metadata: ~
```
If someone wants to get all logs, I suggest the following should be possible:
```
- type: docker
```
It would set automatically the above as the default path to be harvested. Currently a "no paths defined" error pops up.
The `add_docker_metadata` should be enabled by default. There must be a config option to disable it if not wanted.
@exekias WDYT?
|
1.0
|
Proposal docker prospector with processor and default behaviour - Currently an example config for the docker prospector looks as following:
```
- type: docker
paths:
- /var/lib/docker/containers/*/*.log
processors:
- add_docker_metadata: ~
```
If someone wants to get all logs, I suggest the following should be possible:
```
- type: docker
```
It would set automatically the above as the default path to be harvested. Currently a "no paths defined" error pops up.
The `add_docker_metadata` should be enabled by default. There must be a config option to disable it if not wanted.
@exekias WDYT?
|
process
|
proposal docker prospector with processor and default behaviour currently an example config for the docker prospector looks as following type docker paths var lib docker containers log processors add docker metadata if someone wants to get all logs i suggest the following should be possible type docker it would set automatically the above as the default path to be harvested currently a no paths defined error pops up the add docker metadata should be enabled by default there must be a config option to disable it if not wanted exekias wdyt
| 1
|
10,181
| 13,044,162,853
|
IssuesEvent
|
2020-07-29 03:47:37
|
tikv/tikv
|
https://api.github.com/repos/tikv/tikv
|
closed
|
UCP: Migrate scalar function `IntervalInt` from TiDB
|
challenge-program-2 component/coprocessor difficulty/easy sig/coprocessor
|
## Description
Port the scalar function `IntervalInt` from TiDB to coprocessor.
## Score
* 50
## Mentor(s)
* @iosmanthus
## Recommended Skills
* Rust programming
## Learning Materials
Already implemented expressions ported from TiDB
- https://github.com/tikv/tikv/tree/master/components/tidb_query/src/rpn_expr)
- https://github.com/tikv/tikv/tree/master/components/tidb_query/src/expr)
|
2.0
|
UCP: Migrate scalar function `IntervalInt` from TiDB -
## Description
Port the scalar function `IntervalInt` from TiDB to coprocessor.
## Score
* 50
## Mentor(s)
* @iosmanthus
## Recommended Skills
* Rust programming
## Learning Materials
Already implemented expressions ported from TiDB
- https://github.com/tikv/tikv/tree/master/components/tidb_query/src/rpn_expr)
- https://github.com/tikv/tikv/tree/master/components/tidb_query/src/expr)
|
process
|
ucp migrate scalar function intervalint from tidb description port the scalar function intervalint from tidb to coprocessor score mentor s iosmanthus recommended skills rust programming learning materials already implemented expressions ported from tidb
| 1
|
8,141
| 11,349,857,584
|
IssuesEvent
|
2020-01-24 06:50:10
|
geneontology/go-ontology
|
https://api.github.com/repos/geneontology/go-ontology
|
closed
|
Obsolete GO:0044402 competition with other organism
|
multi-species process obsoletion
|
Dear all,
The proposal has been made to obsolete GO:0044402 competition with other organism. The reason for obsoletion is that this term is outside the scope of GO.
There are no annotations and not mappings to this term. This term is not present in any subsets.
Any comments can be added to the issue: https://github.com/geneontology/go-ontology/issues/18568
We are opening a comment period for this proposed obsoletion. We'd like to proceed and obsolete this term on January 24th, 2020. Unless objections are received by January 24th, 2020, we will assume that you agree to this change.
Thanks, Pascale
|
1.0
|
Obsolete GO:0044402 competition with other organism - Dear all,
The proposal has been made to obsolete GO:0044402 competition with other organism. The reason for obsoletion is that this term is outside the scope of GO.
There are no annotations and not mappings to this term. This term is not present in any subsets.
Any comments can be added to the issue: https://github.com/geneontology/go-ontology/issues/18568
We are opening a comment period for this proposed obsoletion. We'd like to proceed and obsolete this term on January 24th, 2020. Unless objections are received by January 24th, 2020, we will assume that you agree to this change.
Thanks, Pascale
|
process
|
obsolete go competition with other organism dear all the proposal has been made to obsolete go competition with other organism the reason for obsoletion is that this term is outside the scope of go there are no annotations and not mappings to this term this term is not present in any subsets any comments can be added to the issue we are opening a comment period for this proposed obsoletion we d like to proceed and obsolete this term on january unless objections are received by january we will assume that you agree to this change thanks pascale
| 1
|
17,744
| 23,658,995,062
|
IssuesEvent
|
2022-08-26 13:55:15
|
streamnative/flink
|
https://api.github.com/repos/streamnative/flink
|
closed
|
[Pulsar IO] investigate jdbc-oracle connector
|
compute/data-processing
|
Currently jdbc does not include an Oracle connector, need to investigate what
|
1.0
|
[Pulsar IO] investigate jdbc-oracle connector - Currently jdbc does not include an Oracle connector, need to investigate what
|
process
|
investigate jdbc oracle connector currently jdbc does not include an oracle connector need to investigate what
| 1
|
7,481
| 10,573,638,612
|
IssuesEvent
|
2019-10-07 12:27:38
|
linnovate/root
|
https://api.github.com/repos/linnovate/root
|
closed
|
inheritance doesnt work when opening a discussion from project
|
Fixed Process bug Projects
|
open a new project
assign different users as partners
go to meetings tab
click on the new meeting
create a new meetings
the partners dont transfer to the new meetings
|
1.0
|
inheritance doesnt work when opening a discussion from project - open a new project
assign different users as partners
go to meetings tab
click on the new meeting
create a new meetings
the partners dont transfer to the new meetings
|
process
|
inheritance doesnt work when opening a discussion from project open a new project assign different users as partners go to meetings tab click on the new meeting create a new meetings the partners dont transfer to the new meetings
| 1
|
Subsets and Splits
No community queries yet
The top public SQL queries from the community will appear here once available.