| Unnamed: 0 (int64, 0–832k) | id (float64, 2.49B–32.1B) | type (string, 1 class) | created_at (string, len 19) | repo (string, len 7–112) | repo_url (string, len 36–141) | action (string, 3 classes) | title (string, len 1–744) | labels (string, len 4–574) | body (string, len 9–211k) | index (string, 10 classes) | text_combine (string, len 96–211k) | label (string, 2 classes) | text (string, len 96–188k) | binary_label (int64, 0–1) |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
91,124
| 8,290,247,196
|
IssuesEvent
|
2018-09-19 16:47:07
|
flutter/flutter
|
https://api.github.com/repos/flutter/flutter
|
closed
|
Make flutter_tools tests work concurrently (without -j1)
|
a: tests
|
One thing that should speed up the bots (#20036) a little is running tests concurrently. We currently use `-j1` because many tests have race conditions when run concurrently (for example many set the currentDirectory on the filesystem).
This issue is to track discussions/fixes for this.
Tests that seem to be setting `fs.currentDirectory` that may need fixing:
- [x] `flutter_tester_test` (#21119)
- [x] `asset_bundle_package_font_test` (#21114)
- [x] `asset_bundle_package_test` (#21427)
- [x] `asset_bundle_variant_test` (#21426)
- [x] `asset_bundle_test` (#21425)
I've also opened #22037 with a change to make tests throw if they try setting `fs.currentDirectory`.
|
1.0
|
Make flutter_tools tests work concurrently (without -j1) - One thing that should speed up the bots (#20036) a little is running tests concurrently. We currently use `-j1` because many tests have race conditions when run concurrently (for example many set the currentDirectory on the filesystem).
This issue is to track discussions/fixes for this.
Tests that seem to be setting `fs.currentDirectory` that may need fixing:
- [x] `flutter_tester_test` (#21119)
- [x] `asset_bundle_package_font_test` (#21114)
- [x] `asset_bundle_package_test` (#21427)
- [x] `asset_bundle_variant_test` (#21426)
- [x] `asset_bundle_test` (#21425)
I've also opened #22037 with a change to make tests throw if they try setting `fs.currentDirectory`.
|
non_process
|
make flutter tools tests work concurrently without one thing that should speed up the bots a little is running tests concurrently we currently use because many tests have race conditions when run concurrently for example many set the currentdirectory on the filesystem this issue is to track discussions fixes for this tests that seem to be setting fs currentdirectory that may need fixing flutter tester test asset bundle package font test asset bundle package test asset bundle variant test asset bundle test i ve also opened with a change to make tests throw if they try setting fs currentdirectory
| 0
|
14,938
| 18,365,858,921
|
IssuesEvent
|
2021-10-10 02:55:10
|
varabyte/kobweb
|
https://api.github.com/repos/varabyte/kobweb
|
opened
|
Gradle plugin: Throw error if a user puts an index.html file in their public/ folder
|
process
|
This will complete with the one we will autogenerate. So they shouldn't do that!
|
1.0
|
Gradle plugin: Throw error if a user puts an index.html file in their public/ folder - This will complete with the one we will autogenerate. So they shouldn't do that!
|
process
|
gradle plugin throw error if a user puts an index html file in their public folder this will complete with the one we will autogenerate so they shouldn t do that
| 1
|
11,198
| 13,957,702,683
|
IssuesEvent
|
2020-10-24 08:13:40
|
alexanderkotsev/geoportal
|
https://api.github.com/repos/alexanderkotsev/geoportal
|
opened
|
MT: Harvest
|
Geoportal Harvesting process MT - Malta
|
Good Morning Angelo,
Kindly can you perform a harvest on the Malta CSW as we need to check some changes we performed.
Regards,
Rene
|
1.0
|
MT: Harvest - Good Morning Angelo,
Kindly can you perform a harvest on the Malta CSW as we need to check some changes we performed.
Regards,
Rene
|
process
|
mt harvest good morning angelo kindly can you perform a harvest on the malta csw as we need to check some changes we performed regards rene
| 1
|
10,299
| 13,152,016,452
|
IssuesEvent
|
2020-08-09 19:49:44
|
GoogleCloudPlatform/stackdriver-sandbox
|
https://api.github.com/repos/GoogleCloudPlatform/stackdriver-sandbox
|
closed
|
OpenTelemetry Tracing for ProductCatalogService
|
lang: go priority: p2 type: process
|
Subtask of #132
Use OpenTelemetry for tracing in the productcatalog service.
|
1.0
|
OpenTelemetry Tracing for ProductCatalogService - Subtask of #132
Use OpenTelemetry for tracing in the productcatalog service.
|
process
|
opentelemetry tracing for productcatalogservice subtask of use opentelemetry for tracing in the productcatalog service
| 1
|
603,975
| 18,675,015,554
|
IssuesEvent
|
2021-10-31 12:10:13
|
siteorigin/siteorigin-panels
|
https://api.github.com/repos/siteorigin/siteorigin-panels
|
closed
|
Add separate setting for panel bottom margin in Page Builder general options
|
enhancement priority-2
|
Hi there, this is a feature request.
I really enjoy working with SOPB however I often have to override via specific CSS rule the widget/panel's bottom margin to make it less than the row's bottom margin. Currently, the widget bottom margin is equal to the row's margin bottom.
I think it would be great to add a separate general setting for the widget/panel bottom margin in Settings > Page Builder.
Setting different margins would great for any website's UX because of basic design/element perception principles. The elements (panels) within a row would be closer to each other, meaning they're correlated one another. This perception would be comunicated at a first sight to the user, leading to a better experience and page readability. That's a basic gestalt psychology grouping mechanism (law of proximity).
What do you think?
|
1.0
|
Add separate setting for panel bottom margin in Page Builder general options - Hi there, this is a feature request.
I really enjoy working with SOPB however I often have to override via specific CSS rule the widget/panel's bottom margin to make it less than the row's bottom margin. Currently, the widget bottom margin is equal to the row's margin bottom.
I think it would be great to add a separate general setting for the widget/panel bottom margin in Settings > Page Builder.
Setting different margins would great for any website's UX because of basic design/element perception principles. The elements (panels) within a row would be closer to each other, meaning they're correlated one another. This perception would be comunicated at a first sight to the user, leading to a better experience and page readability. That's a basic gestalt psychology grouping mechanism (law of proximity).
What do you think?
|
non_process
|
add separate setting for panel bottom margin in page builder general options hi there this is a feature request i really enjoy working with sopb however i often have to override via specific css rule the widget panel s bottom margin to make it less than the row s bottom margin currently the widget bottom margin is equal to the row s margin bottom i think it would be great to add a separate general setting for the widget panel bottom margin in settings page builder setting different margins would great for any website s ux because of basic design element perception principles the elements panels within a row would be closer to each other meaning they re correlated one another this perception would be comunicated at a first sight to the user leading to a better experience and page readability that s a basic gestalt psychology grouping mechanism law of proximity what do you think
| 0
|
72,916
| 31,779,113,550
|
IssuesEvent
|
2023-09-12 16:08:56
|
openstreetmap/operations
|
https://api.github.com/repos/openstreetmap/operations
|
closed
|
Rename openstreetmap-fastly-processed-logs AWS S3 bucket
|
service:tiles location:aws
|
This bucket is not for processed logs, but used as an Athena workspace which saves query results, temp tables, etc. Its name should reflect the real usage.
|
1.0
|
Rename openstreetmap-fastly-processed-logs AWS S3 bucket - This bucket is not for processed logs, but used as an Athena workspace which saves query results, temp tables, etc. Its name should reflect the real usage.
|
non_process
|
rename openstreetmap fastly processed logs aws bucket this bucket is not for processed logs but used as an athena workspace which saves query results temp tables etc its name should reflect the real usage
| 0
|
7,986
| 10,147,122,056
|
IssuesEvent
|
2019-08-05 09:47:57
|
PG85/OpenTerrainGenerator
|
https://api.github.com/repos/PG85/OpenTerrainGenerator
|
closed
|
Primal core: endless loop! issue (otg.generator.resource.TreeGen.spawnInChunk)
|
Bug Forge Mod Compatibility Resource Spawning
|
World generator triggering an endless loop.
OpenTerrainGenerator-1.12.2 - v6
Biome_Bundle-1.12.2-v6.1
> Stacktrace:
at com.pg85.otg.generator.resource.TreeGen.spawnInChunk(TreeGen.java:153)
at com.pg85.otg.generator.resource.Resource.process(Resource.java:152)
at com.pg85.otg.generator.ObjectSpawner.populate(ObjectSpawner.java:259)
at com.pg85.otg.forge.generator.OTGChunkGenerator.func_185931_b(OTGChunkGenerator.java:203)
at net.minecraft.world.chunk.Chunk.func_186034_a(Chunk.java:1016)
at net.minecraft.world.chunk.Chunk.func_186030_a(Chunk.java:988)
at net.minecraft.world.gen.ChunkProviderServer.func_186025_d(ChunkProviderServer.java:157)
at net.minecraft.world.World.func_72964_e(World.java:309)
at net.minecraft.world.World.func_175726_f(World.java:304)
at net.minecraft.world.World.func_180495_p(World.java:910)
at net.minecraft.world.World.func_190529_b(World.java:580)
at net.minecraft.world.World.func_190522_c(World.java:478)
at net.minecraft.world.World.markAndNotifyBlock(World.java:389)
at net.minecraft.world.World.func_180501_a(World.java:360)
at nmd.primal.core.common.world.feature.GenMinableSubOre.func_180709_b(GenMinableSubOre.java:95)
at nmd.primal.core.common.world.WorldGenCommon.generate(WorldGenCommon.java:82)
at nmd.primal.core.common.world.generators.PrimalWorld.generate(PrimalWorld.java:41)
at net.minecraftforge.fml.common.registry.GameRegistry.generateWorld(GameRegistry.java:167)
at net.minecraft.world.chunk.Chunk.func_186034_a(Chunk.java:1017)
at net.minecraft.world.chunk.Chunk.func_186030_a(Chunk.java:988)
at net.minecraft.world.gen.ChunkProviderServer.func_186025_d(ChunkProviderServer.java:157)
at net.minecraft.server.management.PlayerChunkMapEntry.func_187268_a(PlayerChunkMapEntry.java:126)
at net.minecraft.server.management.PlayerChunkMap.func_72693_b(SourceFile:147)
at net.minecraft.world.WorldServer.func_72835_b(WorldServer.java:227)
>
|
True
|
Primal core: endless loop! issue (otg.generator.resource.TreeGen.spawnInChunk) - World generator triggering an endless loop.
OpenTerrainGenerator-1.12.2 - v6
Biome_Bundle-1.12.2-v6.1
> Stacktrace:
at com.pg85.otg.generator.resource.TreeGen.spawnInChunk(TreeGen.java:153)
at com.pg85.otg.generator.resource.Resource.process(Resource.java:152)
at com.pg85.otg.generator.ObjectSpawner.populate(ObjectSpawner.java:259)
at com.pg85.otg.forge.generator.OTGChunkGenerator.func_185931_b(OTGChunkGenerator.java:203)
at net.minecraft.world.chunk.Chunk.func_186034_a(Chunk.java:1016)
at net.minecraft.world.chunk.Chunk.func_186030_a(Chunk.java:988)
at net.minecraft.world.gen.ChunkProviderServer.func_186025_d(ChunkProviderServer.java:157)
at net.minecraft.world.World.func_72964_e(World.java:309)
at net.minecraft.world.World.func_175726_f(World.java:304)
at net.minecraft.world.World.func_180495_p(World.java:910)
at net.minecraft.world.World.func_190529_b(World.java:580)
at net.minecraft.world.World.func_190522_c(World.java:478)
at net.minecraft.world.World.markAndNotifyBlock(World.java:389)
at net.minecraft.world.World.func_180501_a(World.java:360)
at nmd.primal.core.common.world.feature.GenMinableSubOre.func_180709_b(GenMinableSubOre.java:95)
at nmd.primal.core.common.world.WorldGenCommon.generate(WorldGenCommon.java:82)
at nmd.primal.core.common.world.generators.PrimalWorld.generate(PrimalWorld.java:41)
at net.minecraftforge.fml.common.registry.GameRegistry.generateWorld(GameRegistry.java:167)
at net.minecraft.world.chunk.Chunk.func_186034_a(Chunk.java:1017)
at net.minecraft.world.chunk.Chunk.func_186030_a(Chunk.java:988)
at net.minecraft.world.gen.ChunkProviderServer.func_186025_d(ChunkProviderServer.java:157)
at net.minecraft.server.management.PlayerChunkMapEntry.func_187268_a(PlayerChunkMapEntry.java:126)
at net.minecraft.server.management.PlayerChunkMap.func_72693_b(SourceFile:147)
at net.minecraft.world.WorldServer.func_72835_b(WorldServer.java:227)
>
|
non_process
|
primal core endless loop issue otg generator resource treegen spawninchunk world generator triggering an endless loop openterraingenerator biome bundle stacktrace at com otg generator resource treegen spawninchunk treegen java at com otg generator resource resource process resource java at com otg generator objectspawner populate objectspawner java at com otg forge generator otgchunkgenerator func b otgchunkgenerator java at net minecraft world chunk chunk func a chunk java at net minecraft world chunk chunk func a chunk java at net minecraft world gen chunkproviderserver func d chunkproviderserver java at net minecraft world world func e world java at net minecraft world world func f world java at net minecraft world world func p world java at net minecraft world world func b world java at net minecraft world world func c world java at net minecraft world world markandnotifyblock world java at net minecraft world world func a world java at nmd primal core common world feature genminablesubore func b genminablesubore java at nmd primal core common world worldgencommon generate worldgencommon java at nmd primal core common world generators primalworld generate primalworld java at net minecraftforge fml common registry gameregistry generateworld gameregistry java at net minecraft world chunk chunk func a chunk java at net minecraft world chunk chunk func a chunk java at net minecraft world gen chunkproviderserver func d chunkproviderserver java at net minecraft server management playerchunkmapentry func a playerchunkmapentry java at net minecraft server management playerchunkmap func b sourcefile at net minecraft world worldserver func b worldserver java
| 0
|
17,558
| 10,082,742,412
|
IssuesEvent
|
2019-07-25 12:04:14
|
senthilbalakrishnanfull/testing
|
https://api.github.com/repos/senthilbalakrishnanfull/testing
|
opened
|
CVE-2018-11695 (High) detected in opennms-opennms-source-23.0.0-1
|
security vulnerability
|
## CVE-2018-11695 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>opennmsopennms-source-23.0.0-1</b></p></summary>
<p>
<p>A Java based fault and performance management system</p>
<p>Library home page: <a href=https://sourceforge.net/projects/opennms/>https://sourceforge.net/projects/opennms/</a></p>
<p>Found in HEAD commit: <a href="https://github.com/senthilbalakrishnanfull/testing/commit/b01154f4f2a0d62cb86b20c539e5c9514f09efac">b01154f4f2a0d62cb86b20c539e5c9514f09efac</a></p>
</p>
</details>
</p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Library Source Files (68)</summary>
<p></p>
<p> * The source files were matched to this source library based on a best effort match. Source libraries are selected from a list of probable public libraries.</p>
<p>
- /testing/node_modules/node-sass/src/libsass/src/expand.hpp
- /testing/node_modules/node-sass/src/libsass/src/expand.cpp
- /testing/node_modules/node-sass/src/sass_types/factory.cpp
- /testing/node_modules/node-sass/src/sass_types/boolean.cpp
- /testing/node_modules/node-sass/src/libsass/src/util.hpp
- /testing/node_modules/node-sass/src/sass_types/value.h
- /testing/node_modules/node-sass/src/libsass/src/emitter.hpp
- /testing/node_modules/node-sass/src/libsass/src/lexer.cpp
- /testing/node_modules/node-sass/src/callback_bridge.h
- /testing/node_modules/node-sass/src/libsass/src/file.cpp
- /testing/node_modules/node-sass/src/libsass/src/sass.cpp
- /testing/node_modules/node-sass/src/libsass/src/operation.hpp
- /testing/node_modules/node-sass/src/libsass/src/operators.hpp
- /testing/node_modules/node-sass/src/libsass/src/constants.hpp
- /testing/node_modules/node-sass/src/libsass/src/error_handling.hpp
- /testing/node_modules/node-sass/src/libsass/src/ast_fwd_decl.cpp
- /testing/node_modules/node-sass/src/custom_importer_bridge.cpp
- /testing/node_modules/node-sass/src/libsass/src/parser.hpp
- /testing/node_modules/node-sass/src/libsass/src/constants.cpp
- /testing/node_modules/node-sass/src/sass_types/list.cpp
- /testing/node_modules/node-sass/src/libsass/src/cssize.cpp
- /testing/node_modules/node-sass/src/libsass/src/functions.hpp
- /testing/node_modules/node-sass/src/libsass/src/util.cpp
- /testing/node_modules/node-sass/src/custom_function_bridge.cpp
- /testing/node_modules/node-sass/src/custom_importer_bridge.h
- /testing/node_modules/node-sass/src/libsass/src/bind.cpp
- /testing/node_modules/node-sass/src/libsass/src/eval.hpp
- /testing/node_modules/node-sass/src/libsass/src/backtrace.cpp
- /testing/node_modules/node-sass/src/libsass/src/extend.cpp
- /testing/node_modules/node-sass/src/sass_context_wrapper.h
- /testing/node_modules/node-sass/src/sass_types/sass_value_wrapper.h
- /testing/node_modules/node-sass/src/libsass/src/error_handling.cpp
- /testing/node_modules/node-sass/src/libsass/src/node.cpp
- /testing/node_modules/node-sass/src/libsass/src/debugger.hpp
- /testing/node_modules/node-sass/src/libsass/src/emitter.cpp
- /testing/node_modules/node-sass/src/sass_types/number.cpp
- /testing/node_modules/node-sass/src/sass_types/color.h
- /testing/node_modules/node-sass/src/libsass/src/sass_values.cpp
- /testing/node_modules/node-sass/src/libsass/src/ast.hpp
- /testing/node_modules/node-sass/src/libsass/src/output.cpp
- /testing/node_modules/node-sass/src/libsass/src/check_nesting.cpp
- /testing/node_modules/node-sass/src/sass_types/null.cpp
- /testing/node_modules/node-sass/src/libsass/src/ast_def_macros.hpp
- /testing/node_modules/node-sass/src/libsass/src/functions.cpp
- /testing/node_modules/node-sass/src/libsass/src/cssize.hpp
- /testing/node_modules/node-sass/src/libsass/src/prelexer.cpp
- /testing/node_modules/node-sass/src/libsass/src/ast.cpp
- /testing/node_modules/node-sass/src/libsass/src/to_c.cpp
- /testing/node_modules/node-sass/src/libsass/src/to_value.hpp
- /testing/node_modules/node-sass/src/libsass/src/ast_fwd_decl.hpp
- /testing/node_modules/node-sass/src/libsass/src/inspect.hpp
- /testing/node_modules/node-sass/src/sass_types/color.cpp
- /testing/node_modules/node-sass/src/libsass/src/values.cpp
- /testing/node_modules/node-sass/src/sass_context_wrapper.cpp
- /testing/node_modules/node-sass/src/sass_types/list.h
- /testing/node_modules/node-sass/src/libsass/src/memory/SharedPtr.hpp
- /testing/node_modules/node-sass/src/libsass/src/check_nesting.hpp
- /testing/node_modules/node-sass/src/libsass/src/to_c.hpp
- /testing/node_modules/node-sass/src/sass_types/map.cpp
- /testing/node_modules/node-sass/src/libsass/src/to_value.cpp
- /testing/node_modules/node-sass/src/libsass/src/context.cpp
- /testing/node_modules/node-sass/src/libsass/src/listize.hpp
- /testing/node_modules/node-sass/src/sass_types/string.cpp
- /testing/node_modules/node-sass/src/libsass/src/sass_context.cpp
- /testing/node_modules/node-sass/src/libsass/src/prelexer.hpp
- /testing/node_modules/node-sass/src/libsass/src/context.hpp
- /testing/node_modules/node-sass/src/sass_types/boolean.h
- /testing/node_modules/node-sass/src/libsass/src/eval.cpp
</p>
</details>
<p></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
An issue was discovered in LibSass through 3.5.2. A NULL pointer dereference was found in the function Sass::Expand::operator which could be leveraged by an attacker to cause a denial of service (application crash) or possibly have unspecified other impact.
<p>Publish Date: 2018-06-04
<p>URL: <a href=https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2018-11695>CVE-2018-11695</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>8.8</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: Required
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: High
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
|
True
|
CVE-2018-11695 (High) detected in opennms-opennms-source-23.0.0-1 - ## CVE-2018-11695 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>opennmsopennms-source-23.0.0-1</b></p></summary>
<p>
<p>A Java based fault and performance management system</p>
<p>Library home page: <a href=https://sourceforge.net/projects/opennms/>https://sourceforge.net/projects/opennms/</a></p>
<p>Found in HEAD commit: <a href="https://github.com/senthilbalakrishnanfull/testing/commit/b01154f4f2a0d62cb86b20c539e5c9514f09efac">b01154f4f2a0d62cb86b20c539e5c9514f09efac</a></p>
</p>
</details>
</p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Library Source Files (68)</summary>
<p></p>
<p> * The source files were matched to this source library based on a best effort match. Source libraries are selected from a list of probable public libraries.</p>
<p>
- /testing/node_modules/node-sass/src/libsass/src/expand.hpp
- /testing/node_modules/node-sass/src/libsass/src/expand.cpp
- /testing/node_modules/node-sass/src/sass_types/factory.cpp
- /testing/node_modules/node-sass/src/sass_types/boolean.cpp
- /testing/node_modules/node-sass/src/libsass/src/util.hpp
- /testing/node_modules/node-sass/src/sass_types/value.h
- /testing/node_modules/node-sass/src/libsass/src/emitter.hpp
- /testing/node_modules/node-sass/src/libsass/src/lexer.cpp
- /testing/node_modules/node-sass/src/callback_bridge.h
- /testing/node_modules/node-sass/src/libsass/src/file.cpp
- /testing/node_modules/node-sass/src/libsass/src/sass.cpp
- /testing/node_modules/node-sass/src/libsass/src/operation.hpp
- /testing/node_modules/node-sass/src/libsass/src/operators.hpp
- /testing/node_modules/node-sass/src/libsass/src/constants.hpp
- /testing/node_modules/node-sass/src/libsass/src/error_handling.hpp
- /testing/node_modules/node-sass/src/libsass/src/ast_fwd_decl.cpp
- /testing/node_modules/node-sass/src/custom_importer_bridge.cpp
- /testing/node_modules/node-sass/src/libsass/src/parser.hpp
- /testing/node_modules/node-sass/src/libsass/src/constants.cpp
- /testing/node_modules/node-sass/src/sass_types/list.cpp
- /testing/node_modules/node-sass/src/libsass/src/cssize.cpp
- /testing/node_modules/node-sass/src/libsass/src/functions.hpp
- /testing/node_modules/node-sass/src/libsass/src/util.cpp
- /testing/node_modules/node-sass/src/custom_function_bridge.cpp
- /testing/node_modules/node-sass/src/custom_importer_bridge.h
- /testing/node_modules/node-sass/src/libsass/src/bind.cpp
- /testing/node_modules/node-sass/src/libsass/src/eval.hpp
- /testing/node_modules/node-sass/src/libsass/src/backtrace.cpp
- /testing/node_modules/node-sass/src/libsass/src/extend.cpp
- /testing/node_modules/node-sass/src/sass_context_wrapper.h
- /testing/node_modules/node-sass/src/sass_types/sass_value_wrapper.h
- /testing/node_modules/node-sass/src/libsass/src/error_handling.cpp
- /testing/node_modules/node-sass/src/libsass/src/node.cpp
- /testing/node_modules/node-sass/src/libsass/src/debugger.hpp
- /testing/node_modules/node-sass/src/libsass/src/emitter.cpp
- /testing/node_modules/node-sass/src/sass_types/number.cpp
- /testing/node_modules/node-sass/src/sass_types/color.h
- /testing/node_modules/node-sass/src/libsass/src/sass_values.cpp
- /testing/node_modules/node-sass/src/libsass/src/ast.hpp
- /testing/node_modules/node-sass/src/libsass/src/output.cpp
- /testing/node_modules/node-sass/src/libsass/src/check_nesting.cpp
- /testing/node_modules/node-sass/src/sass_types/null.cpp
- /testing/node_modules/node-sass/src/libsass/src/ast_def_macros.hpp
- /testing/node_modules/node-sass/src/libsass/src/functions.cpp
- /testing/node_modules/node-sass/src/libsass/src/cssize.hpp
- /testing/node_modules/node-sass/src/libsass/src/prelexer.cpp
- /testing/node_modules/node-sass/src/libsass/src/ast.cpp
- /testing/node_modules/node-sass/src/libsass/src/to_c.cpp
- /testing/node_modules/node-sass/src/libsass/src/to_value.hpp
- /testing/node_modules/node-sass/src/libsass/src/ast_fwd_decl.hpp
- /testing/node_modules/node-sass/src/libsass/src/inspect.hpp
- /testing/node_modules/node-sass/src/sass_types/color.cpp
- /testing/node_modules/node-sass/src/libsass/src/values.cpp
- /testing/node_modules/node-sass/src/sass_context_wrapper.cpp
- /testing/node_modules/node-sass/src/sass_types/list.h
- /testing/node_modules/node-sass/src/libsass/src/memory/SharedPtr.hpp
- /testing/node_modules/node-sass/src/libsass/src/check_nesting.hpp
- /testing/node_modules/node-sass/src/libsass/src/to_c.hpp
- /testing/node_modules/node-sass/src/sass_types/map.cpp
- /testing/node_modules/node-sass/src/libsass/src/to_value.cpp
- /testing/node_modules/node-sass/src/libsass/src/context.cpp
- /testing/node_modules/node-sass/src/libsass/src/listize.hpp
- /testing/node_modules/node-sass/src/sass_types/string.cpp
- /testing/node_modules/node-sass/src/libsass/src/sass_context.cpp
- /testing/node_modules/node-sass/src/libsass/src/prelexer.hpp
- /testing/node_modules/node-sass/src/libsass/src/context.hpp
- /testing/node_modules/node-sass/src/sass_types/boolean.h
- /testing/node_modules/node-sass/src/libsass/src/eval.cpp
</p>
</details>
<p></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
An issue was discovered in LibSass through 3.5.2. A NULL pointer dereference was found in the function Sass::Expand::operator which could be leveraged by an attacker to cause a denial of service (application crash) or possibly have unspecified other impact.
<p>Publish Date: 2018-06-04
<p>URL: <a href=https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2018-11695>CVE-2018-11695</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>8.8</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: Required
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: High
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
|
non_process
|
cve high detected in opennms opennms source cve high severity vulnerability vulnerable library opennmsopennms source a java based fault and performance management system library home page a href found in head commit a href library source files the source files were matched to this source library based on a best effort match source libraries are selected from a list of probable public libraries testing node modules node sass src libsass src expand hpp testing node modules node sass src libsass src expand cpp testing node modules node sass src sass types factory cpp testing node modules node sass src sass types boolean cpp testing node modules node sass src libsass src util hpp testing node modules node sass src sass types value h testing node modules node sass src libsass src emitter hpp testing node modules node sass src libsass src lexer cpp testing node modules node sass src callback bridge h testing node modules node sass src libsass src file cpp testing node modules node sass src libsass src sass cpp testing node modules node sass src libsass src operation hpp testing node modules node sass src libsass src operators hpp testing node modules node sass src libsass src constants hpp testing node modules node sass src libsass src error handling hpp testing node modules node sass src libsass src ast fwd decl cpp testing node modules node sass src custom importer bridge cpp testing node modules node sass src libsass src parser hpp testing node modules node sass src libsass src constants cpp testing node modules node sass src sass types list cpp testing node modules node sass src libsass src cssize cpp testing node modules node sass src libsass src functions hpp testing node modules node sass src libsass src util cpp testing node modules node sass src custom function bridge cpp testing node modules node sass src custom importer bridge h testing node modules node sass src libsass src bind cpp testing node modules node sass src libsass src eval hpp testing node modules 
node sass src libsass src backtrace cpp testing node modules node sass src libsass src extend cpp testing node modules node sass src sass context wrapper h testing node modules node sass src sass types sass value wrapper h testing node modules node sass src libsass src error handling cpp testing node modules node sass src libsass src node cpp testing node modules node sass src libsass src debugger hpp testing node modules node sass src libsass src emitter cpp testing node modules node sass src sass types number cpp testing node modules node sass src sass types color h testing node modules node sass src libsass src sass values cpp testing node modules node sass src libsass src ast hpp testing node modules node sass src libsass src output cpp testing node modules node sass src libsass src check nesting cpp testing node modules node sass src sass types null cpp testing node modules node sass src libsass src ast def macros hpp testing node modules node sass src libsass src functions cpp testing node modules node sass src libsass src cssize hpp testing node modules node sass src libsass src prelexer cpp testing node modules node sass src libsass src ast cpp testing node modules node sass src libsass src to c cpp testing node modules node sass src libsass src to value hpp testing node modules node sass src libsass src ast fwd decl hpp testing node modules node sass src libsass src inspect hpp testing node modules node sass src sass types color cpp testing node modules node sass src libsass src values cpp testing node modules node sass src sass context wrapper cpp testing node modules node sass src sass types list h testing node modules node sass src libsass src memory sharedptr hpp testing node modules node sass src libsass src check nesting hpp testing node modules node sass src libsass src to c hpp testing node modules node sass src sass types map cpp testing node modules node sass src libsass src to value cpp testing node modules node sass src libsass src context cpp 
testing node modules node sass src libsass src listize hpp testing node modules node sass src sass types string cpp testing node modules node sass src libsass src sass context cpp testing node modules node sass src libsass src prelexer hpp testing node modules node sass src libsass src context hpp testing node modules node sass src sass types boolean h testing node modules node sass src libsass src eval cpp vulnerability details an issue was discovered in libsass through a null pointer dereference was found in the function sass expand operator which could be leveraged by an attacker to cause a denial of service application crash or possibly have unspecified other impact publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction required scope unchanged impact metrics confidentiality impact high integrity impact high availability impact high for more information on scores click a href step up your open source security game with whitesource
| 0
|
22,750
| 3,794,419,563
|
IssuesEvent
|
2016-03-22 16:51:43
|
flutter/flutter
|
https://api.github.com/repos/flutter/flutter
|
opened
|
Double shadow flash when moving component list underneath app bar
|
affects: material design customer: gallery ⚠ bug
|
Scrolling the component list up under the app bar flashes the shadow at the border between the app bar and the component list.
Movie: https://dl.dropboxusercontent.com/u/316685/RECORDING.mp4
|
1.0
|
Double shadow flash when moving component list underneath app bar - Scrolling the component list up under the app bar flashes the shadow at the border between the app bar and the component list.
Movie: https://dl.dropboxusercontent.com/u/316685/RECORDING.mp4
|
non_process
|
double shadow flash when moving component list underneath app bar scrolling the component list up under the app bar flashes the shadow at the border between the app bar and the component list movie
| 0
|
449,898
| 31,877,440,185
|
IssuesEvent
|
2023-09-16 01:44:49
|
ossf/scorecard
|
https://api.github.com/repos/ossf/scorecard
|
closed
|
Feature: Improve docs on using package manager flags
|
documentation enhancement no-issue-activity
|
**Is your feature request related to a problem? Please describe.**
Scorecard can receive as input the name of the package from `npm`, `pypi` and `rubygems` ecosystems as per [the documentation](https://github.com/ossf/scorecard/blob/4cd5446862ea4c470810fea81fc7f45a36d04dec/README.md?plain=1#L423-L427). It is unclear to me if using such flags changes the evaluation of the package to be more specific to XYZ ecosystem, or if it's only used to find the repository source through the package manager. And it's also unclear that such flags cannot be used along with `--repo` flag before getting an error.
**Describe the solution you'd like**
I would like the documentation section `Using a Package manager` to explain if using the package ecosystem flags affect the final evaluation or not and that such flags cannot be used along with `--repo` flag.
**Describe alternatives you've considered**
None.
**Additional context**
None.
|
1.0
|
Feature: Improve docs on using package manager flags - **Is your feature request related to a problem? Please describe.**
Scorecard can receive as input the name of the package from `npm`, `pypi` and `rubygems` ecosystems as per [the documentation](https://github.com/ossf/scorecard/blob/4cd5446862ea4c470810fea81fc7f45a36d04dec/README.md?plain=1#L423-L427). It is unclear to me if using such flags changes the evaluation of the package to be more specific to XYZ ecosystem, or if it's only used to find the repository source through the package manager. And it's also unclear that such flags cannot be used along with `--repo` flag before getting an error.
**Describe the solution you'd like**
I would like the documentation section `Using a Package manager` to explain if using the package ecosystem flags affect the final evaluation or not and that such flags cannot be used along with `--repo` flag.
**Describe alternatives you've considered**
None.
**Additional context**
None.
|
non_process
|
feature improve docs on using package manager flags is your feature request related to a problem please describe scorecard can receive as input the name of the package from npm pypi and rubygems ecosystems as per it is unclear to me if using such flags changes the evaluation of the package to be more specific to xyz ecosystem or if it s only used to find the repository source through the package manager and it s also unclear that such flags cannot be used along with repo flag before getting an error describe the solution you d like i would like the documentation section using a package manager to explain if using the package ecosystem flags affect the final evaluation or not and that such flags cannot be used along with repo flag describe alternatives you ve considered none additional context none
| 0
|
40,817
| 10,168,215,486
|
IssuesEvent
|
2019-08-07 20:14:14
|
USDepartmentofLabor/OCIO-DOLSafety-iOS
|
https://api.github.com/repos/USDepartmentofLabor/OCIO-DOLSafety-iOS
|
closed
|
Functional - Resources Screen - Address Punctuation Needs Fixes
|
Fixed defect
|
For the second line of the address, a comma is needed after “Washington” along with a period after in the “D.C” or remove the first one.
Please see the attached screenshot.

|
1.0
|
Functional - Resources Screen - Address Punctuation Needs Fixes - For the second line of the address, a comma is needed after “Washington” along with a period after in the “D.C” or remove the first one.
Please see the attached screenshot.

|
non_process
|
functional resources screen address punctuation needs fixes for the second line of the address a comma is needed after “washington” along with a period after in the “d c” or remove the first one please see the attached screenshot
| 0
|
105,154
| 16,624,262,132
|
IssuesEvent
|
2021-06-03 07:35:38
|
Thanraj/OpenSSL_1.0.1q
|
https://api.github.com/repos/Thanraj/OpenSSL_1.0.1q
|
opened
|
CVE-2016-0797 (High) detected in opensslOpenSSL_1_0_1q, opensslOpenSSL_1_0_1q
|
security vulnerability
|
## CVE-2016-0797 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Libraries - <b>opensslOpenSSL_1_0_1q</b>, <b>opensslOpenSSL_1_0_1q</b></p></summary>
<p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
Multiple integer overflows in OpenSSL 1.0.1 before 1.0.1s and 1.0.2 before 1.0.2g allow remote attackers to cause a denial of service (heap memory corruption or NULL pointer dereference) or possibly have unspecified other impact via a long digit string that is mishandled by the (1) BN_dec2bn or (2) BN_hex2bn function, related to crypto/bn/bn.h and crypto/bn/bn_print.c.
<p>Publish Date: 2016-03-03
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2016-0797>CVE-2016-0797</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.5</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://nvd.nist.gov/vuln/detail/CVE-2016-0797">https://nvd.nist.gov/vuln/detail/CVE-2016-0797</a></p>
<p>Release Date: 2016-03-03</p>
<p>Fix Resolution: 1.0.1s,1.0.2g</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
|
True
|
CVE-2016-0797 (High) detected in opensslOpenSSL_1_0_1q, opensslOpenSSL_1_0_1q - ## CVE-2016-0797 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Libraries - <b>opensslOpenSSL_1_0_1q</b>, <b>opensslOpenSSL_1_0_1q</b></p></summary>
<p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
Multiple integer overflows in OpenSSL 1.0.1 before 1.0.1s and 1.0.2 before 1.0.2g allow remote attackers to cause a denial of service (heap memory corruption or NULL pointer dereference) or possibly have unspecified other impact via a long digit string that is mishandled by the (1) BN_dec2bn or (2) BN_hex2bn function, related to crypto/bn/bn.h and crypto/bn/bn_print.c.
<p>Publish Date: 2016-03-03
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2016-0797>CVE-2016-0797</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.5</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://nvd.nist.gov/vuln/detail/CVE-2016-0797">https://nvd.nist.gov/vuln/detail/CVE-2016-0797</a></p>
<p>Release Date: 2016-03-03</p>
<p>Fix Resolution: 1.0.1s,1.0.2g</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
|
non_process
|
cve high detected in opensslopenssl opensslopenssl cve high severity vulnerability vulnerable libraries opensslopenssl opensslopenssl vulnerability details multiple integer overflows in openssl before and before allow remote attackers to cause a denial of service heap memory corruption or null pointer dereference or possibly have unspecified other impact via a long digit string that is mishandled by the bn or bn function related to crypto bn bn h and crypto bn bn print c publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction none scope unchanged impact metrics confidentiality impact none integrity impact none availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution step up your open source security game with whitesource
| 0
|
631,320
| 20,150,489,013
|
IssuesEvent
|
2022-02-09 11:53:12
|
pulibrary/orangelight
|
https://api.github.com/repos/pulibrary/orangelight
|
closed
|
Downcase what we get from cas
|
high-priority
|
We notice that if we submit a net id with a capitalized letter through CAS, CAS will not downcase it.
|
1.0
|
Downcase what we get from cas - We notice that if we submit a net id with a capitalized letter through CAS, CAS will not downcase it.
|
non_process
|
downcase what we get from cas we notice that if we submit a net id with a capitalized letter through cas cas will not downcase it
| 0
|
72,446
| 9,593,376,269
|
IssuesEvent
|
2019-05-09 11:22:02
|
bounswe/bounswe2019group9
|
https://api.github.com/repos/bounswe/bounswe2019group9
|
closed
|
Evaluate requirements and mockups
|
status : Not Started Yet type : documentation urgency : high
|
Evaluate it in a few (points)sentences and write it to the related subtitle in the "Evaluation of Deliverables" part in milestone report.
- [ ] Halit
- [x] Emirhan
- [ ] Egemen
- [x] Ali Ramazan
|
1.0
|
Evaluate requirements and mockups - Evaluate it in a few (points)sentences and write it to the related subtitle in the "Evaluation of Deliverables" part in milestone report.
- [ ] Halit
- [x] Emirhan
- [ ] Egemen
- [x] Ali Ramazan
|
non_process
|
evaluate requirements and mockups evaluate it in a few points sentences and write it to the related subtitle in the evaluation of deliverables part in milestone report halit emirhan egemen ali ramazan
| 0
|
11,542
| 8,407,106,924
|
IssuesEvent
|
2018-10-11 19:55:57
|
mycelium-com/wallet-android
|
https://api.github.com/repos/mycelium-com/wallet-android
|
opened
|
Encrypted backup of non-masterseed derived data
|
enhancement security
|
Currently, migrating from one phone to another with only the 12 word backup comes at a loss of metadata and absent of some extra work even at the loss of accounts. In line with BIP44, Mycelium does not explore accounts, so users are left to (re)create them on the new device and same goes with the accounts covered by the masterseed in extension to BIP44, namely the coinapult accounts. Lastly, on top of that meta data, there are unrelated accounts that are not covered by the 12 words backup. The current "solution" of [BIP38](https://github.com/bitcoin/bips/blob/master/bip-0038.mediawiki) encrypted keys is cumbersome and users can lose these backups or do them wrongly too easily. They are required to create a pdf, **print it**, **write a key on it** and never lose it.
If we trust our cryptography, we should be able to do better. In order to have no security degradation I propose to use the same primitives as in BIP38 but with a symmetric key derived from the masterseed to store the necessary encrypted data at a place of the user's choice (just like the legacy backup pdf) or propose to store it on our servers or other services (google drive, dropbox, ...).
Things to store:
* address book
* transaction labels
* BIP70 payment requests
* list of activated/archived account indices
* account labels
* unrelated xpriv accounts
* unrelated single key accounts
* state of Coinapult activation
* (date of backup)
Workflow
=======
Format
---------
JSON
Backup
----------
* A user who never did this kind of backup should be presented a list of options of where to store backups.
Services that can then work without further user interaction after an initial setup should be "recommended"
* Trigger the backup mechanism every time any of the backupable data changes.
* If a service that can work automatically was selected, store backup.
* Else, show missing backup warning.
Restore
----------
* When a user restores an account from his 12 words backup, ask him if he might have a backup.
* Allow users to load backups from settings menu, in case they remember later.
Further thoughts
----------------------
* Users paying BIP70 invoices a lot might need to store more data than others and frequently.
* We might want to speed backup up by splitting the backup into many smaller files.
Related issues
===========
* #124 is about exporting things for use in Excel or do bookkeeping.
* #298 is about exporting/importing non-private key material in unencrypted form.
|
True
|
Encrypted backup of non-masterseed derived data - Currently, migrating from one phone to another with only the 12 word backup comes at a loss of metadata and absent of some extra work even at the loss of accounts. In line with BIP44, Mycelium does not explore accounts, so users are left to (re)create them on the new device and same goes with the accounts covered by the masterseed in extension to BIP44, namely the coinapult accounts. Lastly, on top of that meta data, there are unrelated accounts that are not covered by the 12 words backup. The current "solution" of [BIP38](https://github.com/bitcoin/bips/blob/master/bip-0038.mediawiki) encrypted keys is cumbersome and users can lose these backups or do them wrongly too easily. They are required to create a pdf, **print it**, **write a key on it** and never lose it.
If we trust our cryptography, we should be able to do better. In order to have no security degradation I propose to use the same primitives as in BIP38 but with a symmetric key derived from the masterseed to store the necessary encrypted data at a place of the user's choice (just like the legacy backup pdf) or propose to store it on our servers or other services (google drive, dropbox, ...).
Things to store:
* address book
* transaction labels
* BIP70 payment requests
* list of activated/archived account indices
* account labels
* unrelated xpriv accounts
* unrelated single key accounts
* state of Coinapult activation
* (date of backup)
Workflow
=======
Format
---------
JSON
Backup
----------
* A user who never did this kind of backup should be presented a list of options of where to store backups.
Services that can then work without further user interaction after an initial setup should be "recommended"
* Trigger the backup mechanism every time any of the backupable data changes.
* If a service that can work automatically was selected, store backup.
* Else, show missing backup warning.
Restore
----------
* When a user restores an account from his 12 words backup, ask him if he might have a backup.
* Allow users to load backups from settings menu, in case they remember later.
Further thoughts
----------------------
* Users paying BIP70 invoices a lot might need to store more data than others and frequently.
* We might want to speed backup up by splitting the backup into many smaller files.
Related issues
===========
* #124 is about exporting things for use in Excel or do bookkeeping.
* #298 is about exporting/importing non-private key material in unencrypted form.
|
non_process
|
encrypted backup of non masterseed derived data currently migrating from one phone to another with only the word backup comes at a loss of metadata and absent of some extra work even at the loss of accounts in line with mycelium does not explore accounts so users are left to re create them on the new device and same goes with the accounts covered by the masterseed in extension to namely the coinapult accounts lastly on top of that meta data there are unrelated accounts that are not covered by the words backup the current solution of encrypted keys is cumbersome and users can lose these backups or do them wrongly too easily they are required to create a pdf print it write a key on it and never lose it if we trust our cryptography we should be able to do better in order to have no security degradation i propose to use the same primitives as in but with a symmetric key derived from the masterseed to store the necessary encrypted data at a place of the user s choice just like the legacy backup pdf or propose to store it on our servers or other services google drive dropbox things to store address book transaction labels payment requests list of activated archived account indices account labels unrelated xpriv accounts unrelated single key accounts state of coinapult activation date of backup workflow format json backup a user who never did this kind of backup should be presented a list of options of where to store backups services that can then work without further user interaction after an initial setup should be recommended trigger the backup mechanism every time any of the backupable data changes if a service that can work automatically was selected store backup else show missing backup warning restore when a user restores an account from his words backup ask him if he might have a backup allow users to load backups from settings menu in case they remember later further thoughts users paying invoices a lot might need to store more data than others and frequently we might want to speed backup up by splitting the backup into many smaller files related issues is about exporting things for use in excel or do bookkeeping is about exporting importing non private key material in unencrypted form
| 0
|
36,914
| 6,557,447,818
|
IssuesEvent
|
2017-09-06 17:28:14
|
CoraleStudios/Colore
|
https://api.github.com/repos/CoraleStudios/Colore
|
closed
|
List of games using Colore
|
Documentation Idea In progress
|
We should actively list and promote the games which use the Colore library.
The ones I know so far are:
[](http://store.steampowered.com/app/290000/) [](http://store.steampowered.com/app/459090/) [](http://store.steampowered.com/app/342260/)
[](http://store.steampowered.com/app/529590/) [](http://store.steampowered.com/app/318970/) [](http://store.steampowered.com/app/423590/)
And yes, that took longer than i thought making it all pretty.
|
1.0
|
List of games using Colore - We should actively list and promote the games which use the Colore library.
The ones I know so far are:
[](http://store.steampowered.com/app/290000/) [](http://store.steampowered.com/app/459090/) [](http://store.steampowered.com/app/342260/)
[](http://store.steampowered.com/app/529590/) [](http://store.steampowered.com/app/318970/) [](http://store.steampowered.com/app/423590/)
And yes, that took longer than i thought making it all pretty.
|
non_process
|
list of games using colore we should actively list and promote the games which use the colore library the ones i know so far are and yes that took longer than i thought making it all pretty
| 0
|
32,270
| 6,756,696,827
|
IssuesEvent
|
2017-10-24 08:10:12
|
primefaces/primeng
|
https://api.github.com/repos/primefaces/primeng
|
closed
|
MegaMenu doesn't compile with TypeScript 2.4
|
confirmed defect
|
**I'm submitting a ...**
```
[X] bug report
```
**Test case**
You can use the following demo app as test case:
https://github.com/ova2/angular-development-with-primeng/tree/master/chapter7/megamenu
**Current behavior**
If you run the showcase for the MegaMenu with TypeScript 2.4 or run the demo app linked above, you will get a compilation error like
```
Type '{ label: string; items: { label: string; }[]; }[]' has no properties in common with type 'MenuItem'.
```
**Expected behavior**
There should be no compilation error, as when compiling with TypeScript 2.3.
**Minimal reproduction of the problem with instructions**
* Install the above app
* or install the current master of PrimeNG and change the requirements in package.json to Angular 4.3, Angular-Cli 1.3, TypeScript 2.4 (lower Angular/Cli versions require TypeScript 2.3, so you need to test with Angular 4.3 and Cli 1.3)
* Run `npm` install and `npm start` and check the MegaMenu
* **Angular version:** 4.3.3
* **PrimeNG version:** 4.1.2
* **Browser:** all
* **Language:** TypeScript 2.4
* **Node (for AoT issues):** 8.1.4
* **Analysis of the problem:**
In the `MenuItem` interface, the `items` property is defined as of type `MenuItem[]`. But in the MegaMenu, you can have arrays of arrays of MenuItems as items, not just arrays of MenuItems.
In TypeScript 2.4, it’s now an error to assign anything to a weak type when there’s no overlap in properties (see [here](https://www.typescriptlang.org/docs/handbook/release-notes/typescript-2-4.html#weak-type-detection)).
Note that in the `MenuItem` interface all properties are marked as optional. Therefore, this is considered a "weak type".
* **Proposed solution:**
The `items` property in the `MenutItem` interface should be defined as follows:
```
items?: MenuItem[]|MenuItem[][];
```
|
1.0
|
MegaMenu doesn't compile with TypeScript 2.4 - **I'm submitting a ...**
```
[X] bug report
```
**Test case**
You can use the following demo app as test case:
https://github.com/ova2/angular-development-with-primeng/tree/master/chapter7/megamenu
**Current behavior**
If you run the showcase for the MegaMenu with TypeScript 2.4 or run the demo app linked above, you will get a compilation error like
```
Type '{ label: string; items: { label: string; }[]; }[]' has no properties in common with type 'MenuItem'.
```
**Expected behavior**
There should be no compilation error, as when compiling with TypeScript 2.3.
**Minimal reproduction of the problem with instructions**
* Install the above app
* or install the current master of PrimeNG and change the requirements in package.json to Angular 4.3, Angular-Cli 1.3, TypeScript 2.4 (lower Angular/Cli versions require TypeScript 2.3, so you need to test with Angular 4.3 and Cli 1.3)
* Run `npm` install and `npm start` and check the MegaMenu
* **Angular version:** 4.3.3
* **PrimeNG version:** 4.1.2
* **Browser:** all
* **Language:** TypeScript 2.4
* **Node (for AoT issues):** 8.1.4
* **Analysis of the problem:**
In the `MenuItem` interface, the `items` property is defined as of type `MenuItem[]`. But in the MegaMenu, you can have arrays of arrays of MenuItems as items, not just arrays of MenuItems.
In TypeScript 2.4, it’s now an error to assign anything to a weak type when there’s no overlap in properties (see [here](https://www.typescriptlang.org/docs/handbook/release-notes/typescript-2-4.html#weak-type-detection)).
Note that in the `MenuItem` interface all properties are marked as optional. Therefore, this is considered a "weak type".
* **Proposed solution:**
The `items` property in the `MenutItem` interface should be defined as follows:
```
items?: MenuItem[]|MenuItem[][];
```
|
non_process
|
megamenu doesn t compile with typescript i m submitting a bug report test case you can use the following demo app as test case current behavior if you run the showcase for the megamenu with typescript or run the demo app linked above you will get a compilation error like type label string items label string has no properties in common with type menuitem expected behavior there should be no compilation error as when compiling with typescript minimal reproduction of the problem with instructions install the above app or install the current master of primeng and change the requirements in package json to angular angular cli typescript lower angular cli versions require typescript so you need to test with angular and cli run npm install and npm start and check the megamenu angular version primeng version browser all language typescript node for aot issues analysis of the problem in the menuitem interface the items property is defined as of type menuitem but in the megamenu you can have arrays of arrays of menuitems as items not just arrays of menuitems in typescript it’s now an error to assign anything to a weak type when there’s no overlap in properties see note that in the menuitem interface all properties are marked as optional therefore this is considered a weak type proposed solution the items property in the menutitem interface should be defined as follows items menuitem menuitem
| 0
|
1,499
| 4,075,958,399
|
IssuesEvent
|
2016-05-29 15:33:29
|
alexrj/Slic3r
|
https://api.github.com/repos/alexrj/Slic3r
|
closed
|
Issue: Max extrusions speed has preference over retract speed
|
Fixable with post-process script Not a bug
|
This happens at least with repetier firmware in slice3r 1.2.9
In eeprom I can set a maximum extrusion speed (something like 2mm/s on 3mm abs on my K8200)
Retraction speed is set at 40mm/s, but it will retract at no greater speed then what I set for max extrusion, so very very slow.
Automatic extrusion in repetier firmware is set to off, because that has another bug, it will not retract at all, just push, but, at the proper speed :-(
Reasonably speaking this is a bug of slic3r.
|
1.0
|
Issue: Max extrusions speed has preference over retract speed - This happens at least with repetier firmware in slice3r 1.2.9
In eeprom I can set a maximum extrusion speed (something like 2mm/s on 3mm abs on my K8200)
Retraction speed is set at 40mm/s, but it will retract at no greater speed then what I set for max extrusion, so very very slow.
Automatic extrusion in repetier firmware is set to off, because that has another bug, it will not retract at all, just push, but, at the proper speed :-(
Reasonably speaking this is a bug of slic3r.
|
process
|
issue max extrusions speed has preference over retract speed this happens at least with repetier firmware in in eeprom i can set a maximum extrusion speed something like s on abs on my retraction speed is set at s but it will retract at no greater speed then what i set for max extrusion so very very slow automatic extrusion in repetier firmware is set to off because that has another bug it will not retract at all just push but at the proper speed reasonably speaking this is a bug of
| 1
|
187,066
| 14,426,956,557
|
IssuesEvent
|
2020-12-06 01:00:40
|
kalexmills/github-vet-tests-dec2020
|
https://api.github.com/repos/kalexmills/github-vet-tests-dec2020
|
closed
|
giantswarm/kvm-operator-node-controller: vendor/k8s.io/kubernetes/plugin/pkg/scheduler/factory/factory_test.go; 5 LoC
|
fresh test tiny vendored
|
Found a possible issue in [giantswarm/kvm-operator-node-controller](https://www.github.com/giantswarm/kvm-operator-node-controller) at [vendor/k8s.io/kubernetes/plugin/pkg/scheduler/factory/factory_test.go](https://github.com/giantswarm/kvm-operator-node-controller/blob/7146561e54142d4f986daee0206336ebee3ceb18/vendor/k8s.io/kubernetes/plugin/pkg/scheduler/factory/factory_test.go#L551-L555)
Below is the message reported by the analyzer for this snippet of code. Beware that the analyzer only reports the first
issue it finds, so please do not limit your consideration to the contents of the below message.
> function call which takes a reference to node at line 552 may start a goroutine
[Click here to see the code in its original context.](https://github.com/giantswarm/kvm-operator-node-controller/blob/7146561e54142d4f986daee0206336ebee3ceb18/vendor/k8s.io/kubernetes/plugin/pkg/scheduler/factory/factory_test.go#L551-L555)
<details>
<summary>Click here to show the 5 line(s) of Go which triggered the analyzer.</summary>
```go
for _, node := range nodeList.Items {
if nodeFunc(&node) {
nodeNames = append(nodeNames, node.Name)
}
}
```
</details>
Leave a reaction on this issue to contribute to the project by classifying this instance as a **Bug** :-1:, **Mitigated** :+1:, or **Desirable Behavior** :rocket:
See the descriptions of the classifications [here](https://github.com/github-vet/rangeclosure-findings#how-can-i-help) for more information.
commit ID: 7146561e54142d4f986daee0206336ebee3ceb18
|
1.0
|
giantswarm/kvm-operator-node-controller: vendor/k8s.io/kubernetes/plugin/pkg/scheduler/factory/factory_test.go; 5 LoC -
Found a possible issue in [giantswarm/kvm-operator-node-controller](https://www.github.com/giantswarm/kvm-operator-node-controller) at [vendor/k8s.io/kubernetes/plugin/pkg/scheduler/factory/factory_test.go](https://github.com/giantswarm/kvm-operator-node-controller/blob/7146561e54142d4f986daee0206336ebee3ceb18/vendor/k8s.io/kubernetes/plugin/pkg/scheduler/factory/factory_test.go#L551-L555)
Below is the message reported by the analyzer for this snippet of code. Beware that the analyzer only reports the first
issue it finds, so please do not limit your consideration to the contents of the below message.
> function call which takes a reference to node at line 552 may start a goroutine
[Click here to see the code in its original context.](https://github.com/giantswarm/kvm-operator-node-controller/blob/7146561e54142d4f986daee0206336ebee3ceb18/vendor/k8s.io/kubernetes/plugin/pkg/scheduler/factory/factory_test.go#L551-L555)
<details>
<summary>Click here to show the 5 line(s) of Go which triggered the analyzer.</summary>
```go
for _, node := range nodeList.Items {
if nodeFunc(&node) {
nodeNames = append(nodeNames, node.Name)
}
}
```
</details>
Leave a reaction on this issue to contribute to the project by classifying this instance as a **Bug** :-1:, **Mitigated** :+1:, or **Desirable Behavior** :rocket:
See the descriptions of the classifications [here](https://github.com/github-vet/rangeclosure-findings#how-can-i-help) for more information.
commit ID: 7146561e54142d4f986daee0206336ebee3ceb18
|
non_process
|
giantswarm kvm operator node controller vendor io kubernetes plugin pkg scheduler factory factory test go loc found a possible issue in at below is the message reported by the analyzer for this snippet of code beware that the analyzer only reports the first issue it finds so please do not limit your consideration to the contents of the below message function call which takes a reference to node at line may start a goroutine click here to show the line s of go which triggered the analyzer go for node range nodelist items if nodefunc node nodenames append nodenames node name leave a reaction on this issue to contribute to the project by classifying this instance as a bug mitigated or desirable behavior rocket see the descriptions of the classifications for more information commit id
| 0
|
238,267
| 26,087,070,396
|
IssuesEvent
|
2022-12-26 05:15:39
|
SmartBear/git-en-boite
|
https://api.github.com/repos/SmartBear/git-en-boite
|
closed
|
WS-2021-0638 (High) detected in mocha-10.0.0.tgz
|
wontfix security vulnerability
|
## WS-2021-0638 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>mocha-10.0.0.tgz</b></p></summary>
<p>simple, flexible, fun test framework</p>
<p>Library home page: <a href="https://registry.npmjs.org/mocha/-/mocha-10.0.0.tgz">https://registry.npmjs.org/mocha/-/mocha-10.0.0.tgz</a></p>
<p>Path to dependency file: /package.json</p>
<p>Path to vulnerable library: /node_modules/mocha/package.json</p>
<p>
Dependency Hierarchy:
- git-en-boite-smoke-tests-0.0.0.tgz (Root Library)
- :x: **mocha-10.0.0.tgz** (Vulnerable Library)
<p>Found in base branch: <b>main</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
There is a regular expression denial of service (ReDoS) vulnerability in mocha.
It allows an attacker to cause a denial of service when stripping a crafted invalid function definition from strings.
<p>Publish Date: 2021-09-18
<p>URL: <a href=https://github.com/mochajs/mocha/commit/61b4b9209c2c64b32c8d48b1761c3b9384d411ea>WS-2021-0638</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.5</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Release Date: 2021-09-18</p>
<p>Fix Resolution: mocha - 10.1.0</p>
</p>
</details>
<p></p>
|
True
|
|
non_process
|
| 0
|
256,123
| 22,039,925,926
|
IssuesEvent
|
2022-05-29 07:23:51
|
tgstation/tgstation
|
https://api.github.com/repos/tgstation/tgstation
|
closed
|
Custom say emote issues.
|
Test Merge Bug
|
Reporting client version: 514.1583
## Round ID:
[183960](https://scrubby.melonmesa.com/round/183960)
## Testmerges:
- [The Humanening](https://www.github.com/tgstation/tgstation/pull/67298)
- [Optimizes Runechat](https://www.github.com/tgstation/tgstation/pull/65791)
- [Adds Cargorilla](https://www.github.com/tgstation/tgstation/pull/67003)
## Reproduction:
Custom say emotes force you to say "An interesting thing to say" in runechat if no second argument is given.
Example:
`screams in agony!*`
Tried reproducing this locally and it didn't happen, so this is almost certainly a #65791 test-merge bug.
|
1.0
|
|
non_process
|
| 0
|
108,966
| 23,688,765,789
|
IssuesEvent
|
2022-08-29 08:53:53
|
anegostudios/VintageStory-Issues
|
https://api.github.com/repos/anegostudios/VintageStory-Issues
|
closed
|
Host Rock Section Missing From Rock Entries in Handbook
|
status: confirmed department: code
|
**Game Version:** 1.17.0-rc5
**Platform:** Windows
**Modded:** No
**SP/MP:** Singleplayer
### Description
The host rock section is missing from the rock entries in the handbook. This section was present in 1.16.5.
### How to reproduce
Open the entry for any rock type.
### Expected behavior
Ores should be listed under "Host rock for" on the rock's page.
### Screenshots
**1.16.5**

**1.17.0-rc5**

### Logs
```
14.8.2022 13:35:28 [Notification] Client logger started.
14.8.2022 13:35:28 [Notification] Game Version: v1.17.0-rc.5 (Unstable)
14.8.2022 13:35:28 [Notification] Screens:
14.8.2022 13:35:28 [Notification] 0: {X=0,Y=0,Width=1920,Height=1080}, \\.\DISPLAY1 (primary)
14.8.2022 13:35:29 [Notification] OpenAL Initialized. Available Mono/Stereo Sources: 255/1
14.8.2022 13:35:29 [Notification] Graphics Card Vendor: NVIDIA Corporation
14.8.2022 13:35:29 [Notification] Graphics Card Version: 3.3.0 NVIDIA 512.15
14.8.2022 13:35:29 [Notification] Graphics Card Renderer: NVIDIA GeForce GTX 1050/PCIe/SSE2
14.8.2022 13:35:29 [Notification] Graphics Card ShadingLanguageVersion: 3.30 NVIDIA via Cg compiler
14.8.2022 13:35:29 [Notification] Cairo Graphics Version: 1.17.3
14.8.2022 13:35:29 [Notification] OpenAL Version: 1.1 ALSOFT 1.16.0
14.8.2022 13:35:29 [Notification] C# Framework: .net Framework 4.0.30319.42000
14.8.2022 13:35:29 [Notification] OpenTK Version: 3.3.2 (A set of fast, low-level C# bindings for OpenGL, OpenGL ES and OpenAL.)
14.8.2022 13:35:29 [Notification] Start discovering assets
14.8.2022 13:35:29 [Notification] Found 18 base assets in category lang
14.8.2022 13:35:29 [Notification] Found 0 base assets in category patches
14.8.2022 13:35:29 [Notification] Found 22 base assets in category config
14.8.2022 13:35:29 [Notification] Found 0 base assets in category worldproperties
14.8.2022 13:35:29 [Notification] Found 46 base assets in category sounds
14.8.2022 13:35:29 [Notification] Found 97 base assets in category shapes
14.8.2022 13:35:29 [Notification] Found 80 base assets in category shaders
14.8.2022 13:35:29 [Notification] Found 16 base assets in category shaderincludes
14.8.2022 13:35:29 [Notification] Found 128 base assets in category textures
14.8.2022 13:35:29 [Notification] Found 4 base assets in category music
14.8.2022 13:35:29 [Notification] Found 0 base assets in category dialog
14.8.2022 13:35:29 [Notification] Found 411 base assets in total
14.8.2022 13:35:29 [Notification] Loading sounds
14.8.2022 13:35:29 [Notification] Sounds loaded
14.8.2022 13:35:29 [Notification] (Re-)loaded frame buffers
14.8.2022 13:35:29 [Notification] Loaded Shaderprogramm for render pass .
14.8.2022 13:35:29 [Notification] CPU Cores: 12
14.8.2022 13:35:29 [Notification] Window was resized to 1920 1080, rebuilding framebuffers...
14.8.2022 13:35:29 [Notification] (Re-)loaded frame buffers
14.8.2022 13:35:29 [Notification] Begin loading shaders
14.8.2022 13:35:29 [Notification] Load shaders now
14.8.2022 13:35:29 [Notification] Loading shaders...
14.8.2022 13:35:29 [Notification] Loaded Shaderprogramm for render pass standard.
14.8.2022 13:35:29 [Notification] Loaded Shaderprogramm for render pass particlescube.
14.8.2022 13:35:29 [Notification] Loaded Shaderprogramm for render pass particlesquad.
14.8.2022 13:35:29 [Notification] Loaded Shaderprogramm for render pass sky.
14.8.2022 13:35:29 [Notification] Loaded Shaderprogramm for render pass nightsky.
14.8.2022 13:35:29 [Notification] Loaded Shaderprogramm for render pass woittest.
14.8.2022 13:35:29 [Notification] Loaded Shaderprogramm for render pass transparentcompose.
14.8.2022 13:35:29 [Notification] Loaded Shaderprogramm for render pass debugdepthbuffer.
14.8.2022 13:35:29 [Notification] Loaded Shaderprogramm for render pass helditem.
14.8.2022 13:35:29 [Notification] Loaded Shaderprogramm for render pass chunkopaque.
14.8.2022 13:35:29 [Notification] Loaded Shaderprogramm for render pass chunkliquid.
14.8.2022 13:35:29 [Notification] Loaded Shaderprogramm for render pass decals.
14.8.2022 13:35:29 [Notification] Loaded Shaderprogramm for render pass final.
14.8.2022 13:35:29 [Notification] Loaded Shaderprogramm for render pass gui.
14.8.2022 13:35:29 [Notification] Loaded Shaderprogramm for render pass blur.
14.8.2022 13:35:29 [Notification] Loaded Shaderprogramm for render pass chunktransparent.
14.8.2022 13:35:29 [Notification] Loaded Shaderprogramm for render pass findbright.
14.8.2022 13:35:29 [Notification] Loaded Shaderprogramm for render pass chunktopsoil.
14.8.2022 13:35:29 [Notification] Loaded Shaderprogramm for render pass godrays.
14.8.2022 13:35:29 [Notification] Loaded Shaderprogramm for render pass autocamera.
14.8.2022 13:35:29 [Notification] Loaded Shaderprogramm for render pass blockhighlights.
14.8.2022 13:35:29 [Notification] Loaded Shaderprogramm for render pass wireframe.
14.8.2022 13:35:29 [Notification] Loaded Shaderprogramm for render pass entityanimated.
14.8.2022 13:35:29 [Notification] Loaded Shaderprogramm for render pass luma.
14.8.2022 13:35:29 [Notification] Loaded Shaderprogramm for render pass blit.
14.8.2022 13:35:29 [Notification] Loaded Shaderprogramm for render pass particlesquad2d.
14.8.2022 13:35:29 [Notification] Loaded Shaderprogramm for render pass shadowmapentityanimated.
14.8.2022 13:35:29 [Notification] Loaded Shaderprogramm for render pass shadowmapgeneric.
14.8.2022 13:35:29 [Notification] Loaded Shaderprogramm for render pass texture2texture.
14.8.2022 13:35:29 [Notification] Loaded Shaderprogramm for render pass celestialobject.
14.8.2022 13:35:29 [Notification] Loaded Shaderprogramm for render pass guitopsoil.
14.8.2022 13:35:29 [Notification] Loaded Shaderprogramm for render pass colorgrade.
14.8.2022 13:35:29 [Notification] Loaded Shaderprogramm for render pass guigear.
14.8.2022 13:35:29 [Notification] Loaded Shaderprogramm for render pass ssao.
14.8.2022 13:35:29 [Notification] Loaded Shaderprogramm for render pass bilateralblur.
14.8.2022 13:35:29 [Notification] Loaded Shaderprogramm for render pass grass.
14.8.2022 13:35:29 [Notification] Loaded Shaderprogramm for render pass flowers.
14.8.2022 13:35:30 [Notification] Loaded Shaderprogramm for render pass shadowgrass.
14.8.2022 13:35:30 [Notification] Loaded Shaderprogramm for render pass shadowflowers.
14.8.2022 13:35:30 [Notification] Cached session key is valid, validating with server
14.8.2022 13:35:30 [Notification] Server validation response: Good
14.8.2022 13:35:30 [Notification] Will search the following paths for mods:
14.8.2022 13:35:30 [Notification] C:\Users\Logan\AppData\Roaming\Vintagestory\Mods
14.8.2022 13:35:30 [Notification] C:\Users\Logan\AppData\Roaming\VintagestoryData\Mods
14.8.2022 13:35:35 [Notification] Initialized GUI Manager
14.8.2022 13:35:36 [Notification] Initialized Server Connection
14.8.2022 13:35:36 [Notification] Server args parsed
14.8.2022 13:35:36 [Notification] GuiScreenConnectingToServer constructed
14.8.2022 13:35:36 [Notification] Server main instantiated
14.8.2022 13:35:42 [Notification] Processed server identification
14.8.2022 13:35:42 [Notification] Map initialized
14.8.2022 13:35:42 [Notification] Received server assets
14.8.2022 13:35:42 [Notification] Loading and pre-starting client side mods...
14.8.2022 13:35:42 [Notification] Will search the following paths for mods:
14.8.2022 13:35:42 [Notification] C:\Users\Logan\AppData\Roaming\Vintagestory\Mods
14.8.2022 13:35:42 [Notification] C:\Users\Logan\AppData\Roaming\VintagestoryData\Mods
14.8.2022 13:35:42 [Notification] Found 40 mods (37 disabled)
14.8.2022 13:35:42 [Notification] Mods, sorted by dependency: game, creative, survival
14.8.2022 13:35:42 [Notification] Instantiated 77 mod systems from 3 enabled mods
14.8.2022 13:35:42 [Notification] Done loading and pre-starting client side mods.
14.8.2022 13:35:42 [Notification] External Origins in load order: modorigin@C:\Users\Logan\AppData\Roaming\Vintagestory\assets\creative\, modorigin@C:\Users\Logan\AppData\Roaming\Vintagestory\assets\survival\
14.8.2022 13:35:42 [Notification] Found 0 external assets in category lang
14.8.2022 13:35:42 [Notification] Found 8 external assets in category patches
14.8.2022 13:35:42 [Notification] Found 43 external assets in category config
14.8.2022 13:35:42 [Notification] Found 24 external assets in category worldproperties
14.8.2022 13:35:42 [Notification] Found 326 external assets in category sounds
14.8.2022 13:35:42 [Notification] Found 2678 external assets in category shapes
14.8.2022 13:35:42 [Notification] Found 13 external assets in category shaders
14.8.2022 13:35:42 [Notification] Found 0 external assets in category shaderincludes
14.8.2022 13:35:42 [Notification] Found 4477 external assets in category textures
14.8.2022 13:35:42 [Notification] Found 63 external assets in category music
14.8.2022 13:35:42 [Notification] Found 17 external assets in category dialog
14.8.2022 13:35:42 [Notification] Found 0 external assets in category compatibility
14.8.2022 13:35:42 [Notification] Reloaded lang file now with mod assets
14.8.2022 13:35:42 [Notification] JsonPatch Loader: Nothing to patch
14.8.2022 13:35:43 [Notification] Received 2408 item types from server
14.8.2022 13:35:43 [Notification] Loaded 8470 block types from server
14.8.2022 13:35:43 [Notification] Reloaded sounds, now with mod assets
14.8.2022 13:35:43 [Notification] Server launched
14.8.2022 13:35:44 [Notification] Composed 1 4096x4096 entities texture atlases from 184 textures (last textureid = 205)
14.8.2022 13:35:45 [Notification] Collected 1573 shapes to tesselate.
14.8.2022 13:35:46 [Notification] Composed 1 4096x4096 items texture atlases from 1232 textures (last textureid = 206)
14.8.2022 13:35:51 [Notification] Composed 1 4096x4096 blocks texture atlases from 3837 textures (last textureid = 207)
14.8.2022 13:35:52 [Notification] Server assets loaded
14.8.2022 13:35:52 [Notification] Loaded Shaderprogramm for render pass anvilworkitem.
14.8.2022 13:35:52 [Notification] Loaded Shaderprogramm for render pass lines.
14.8.2022 13:35:52 [Notification] Loaded Shaderprogramm for render pass aurora.
14.8.2022 13:35:52 [Notification] Loaded Shaderprogramm for render pass rift.
14.8.2022 13:35:52 [Notification] Loaded Shaderprogramm for render pass machinegear.
14.8.2022 13:35:52 [Notification] Loaded Shaderprogramm for render pass sleepoverlay.
14.8.2022 13:35:52 [Notification] Started 48 systems on Client:
14.8.2022 13:35:52 [Notification] Mod 'VSEssentials.dll' (game):
14.8.2022 13:35:52 [Notification] Vintagestory.ServerMods.Core
14.8.2022 13:35:52 [Notification] Vintagestory.GameContent.EntityPartitioning
14.8.2022 13:35:52 [Notification] Vintagestory.GameContent.ErrorReporter
14.8.2022 13:35:52 [Notification] Vintagestory.ServerMods.ModCompatiblityUtil
14.8.2022 13:35:52 [Notification] Vintagestory.ServerMods.NoObf.ModJsonPatchLoader
14.8.2022 13:35:52 [Notification] Vintagestory.GameContent.FallingBlockParticlesModSystem
14.8.2022 13:35:52 [Notification] Vintagestory.GameContent.CharacterExtraDialogs
14.8.2022 13:35:52 [Notification] Vintagestory.GameContent.EntityNameTagRendererRegistry
14.8.2022 13:35:52 [Notification] Vintagestory.GameContent.POIRegistry
14.8.2022 13:35:52 [Notification] Vintagestory.GameContent.RoomRegistry
14.8.2022 13:35:52 [Notification] Vintagestory.GameContent.WeatherSystemCommands
14.8.2022 13:35:52 [Notification] Vintagestory.GameContent.WeatherSystemClient
14.8.2022 13:35:52 [Notification] Vintagestory.GameContent.WorldMapManager
14.8.2022 13:35:52 [Notification] Vintagestory.ServerMods.LoadColorMaps
14.8.2022 13:35:52 [Notification] Vintagestory.GameContent.ClothManager
14.8.2022 13:35:52 [Notification] Mod 'VSSurvivalMod.dll' (survival):
14.8.2022 13:35:52 [Notification] Vintagestory.GameContent.SaplingControl
14.8.2022 13:35:52 [Notification] Vintagestory.GameContent.SurvivalCoreSystem
14.8.2022 13:35:52 [Notification] Vintagestory.ServerMods.GenFromHeightmap
14.8.2022 13:35:52 [Notification] Vintagestory.ServerMods.GenMaps
14.8.2022 13:35:52 [Notification] Vintagestory.ServerMods.DebugSystem
14.8.2022 13:35:52 [Notification] Vintagestory.ServerMods.UpgradeTasks
14.8.2022 13:35:52 [Notification] Vintagestory.GameContent.MyceliumSystem
14.8.2022 13:35:52 [Notification] Vintagestory.GameContent.ModSystemAuction
14.8.2022 13:35:52 [Notification] Vintagestory.GameContent.ModSystemHandbook
14.8.2022 13:35:52 [Notification] Vintagestory.GameContent.LiquidItemStackRenderer
14.8.2022 13:35:52 [Notification] Vintagestory.GameContent.CharacterSystem
14.8.2022 13:35:52 [Notification] Vintagestory.GameContent.FruitingSystem
14.8.2022 13:35:52 [Notification] Vintagestory.GameContent.ModSystemRifts
14.8.2022 13:35:52 [Notification] Vintagestory.GameContent.ModSystemRiftWeather
14.8.2022 13:35:52 [Notification] Vintagestory.GameContent.ModTemperature
14.8.2022 13:35:52 [Notification] Vintagestory.GameContent.SystemTemporalStability
14.8.2022 13:35:52 [Notification] Vintagestory.GameContent.MealMeshCache
14.8.2022 13:35:52 [Notification] Vintagestory.GameContent.ModSystemWearableStats
14.8.2022 13:35:52 [Notification] Vintagestory.GameContent.ModSystemBlockReinforcement
14.8.2022 13:35:52 [Notification] Vintagestory.GameContent.ChiselBlockModelCache
14.8.2022 13:35:52 [Notification] Vintagestory.GameContent.TemporalStabilityEffects
14.8.2022 13:35:52 [Notification] Vintagestory.GameContent.ModJournal
14.8.2022 13:35:52 [Notification] Vintagestory.GameContent.ModLootRandomizer
14.8.2022 13:35:52 [Notification] Vintagestory.GameContent.ModSleeping
14.8.2022 13:35:52 [Notification] Vintagestory.GameContent.TeleporterManager
14.8.2022 13:35:52 [Notification] Vintagestory.GameContent.Mechanics.MechanicalPowerMod
14.8.2022 13:35:52 [Notification] Vintagestory.GameContent.TradeHandbookInfo
14.8.2022 13:35:52 [Notification] Vintagestory.ServerMods.GenStructures
14.8.2022 13:35:52 [Notification] Vintagestory.ServerMods.GenStructuresPosPass
14.8.2022 13:35:52 [Notification] Vintagestory.GameContent.RecipeRegistrySystem
14.8.2022 13:35:52 [Notification] Vintagestory.GameContent.TraderOutfits
14.8.2022 13:35:52 [Notification] Mod 'VSCreativeMod.dll' (creative):
14.8.2022 13:35:52 [Notification] Vintagestory.ServerMods.Core
14.8.2022 13:35:52 [Notification] Vintagestory.ServerMods.WorldEdit.WorldEdit
14.8.2022 13:35:52 [Notification] Loading shaders...
14.8.2022 13:35:52 [Notification] Loaded Shaderprogramm for render pass standard.
14.8.2022 13:35:52 [Notification] Loaded Shaderprogramm for render pass particlescube.
14.8.2022 13:35:52 [Notification] Loaded Shaderprogramm for render pass particlesquad.
14.8.2022 13:35:52 [Notification] Loaded Shaderprogramm for render pass sky.
14.8.2022 13:35:52 [Notification] Loaded Shaderprogramm for render pass nightsky.
14.8.2022 13:35:52 [Notification] Loaded Shaderprogramm for render pass woittest.
14.8.2022 13:35:52 [Notification] Loaded Shaderprogramm for render pass transparentcompose.
14.8.2022 13:35:52 [Notification] Loaded Shaderprogramm for render pass debugdepthbuffer.
14.8.2022 13:35:52 [Notification] Loaded Shaderprogramm for render pass helditem.
14.8.2022 13:35:52 [Notification] Loaded Shaderprogramm for render pass chunkopaque.
14.8.2022 13:35:52 [Notification] Loaded Shaderprogramm for render pass chunkliquid.
14.8.2022 13:35:52 [Notification] Loaded Shaderprogramm for render pass decals.
14.8.2022 13:35:52 [Notification] Loaded Shaderprogramm for render pass final.
14.8.2022 13:35:52 [Notification] Loaded Shaderprogramm for render pass gui.
14.8.2022 13:35:52 [Notification] Loaded Shaderprogramm for render pass blur.
14.8.2022 13:35:52 [Notification] Loaded Shaderprogramm for render pass chunktransparent.
14.8.2022 13:35:52 [Notification] Loaded Shaderprogramm for render pass findbright.
14.8.2022 13:35:52 [Notification] Loaded Shaderprogramm for render pass chunktopsoil.
14.8.2022 13:35:52 [Notification] Loaded Shaderprogramm for render pass godrays.
14.8.2022 13:35:52 [Notification] Loaded Shaderprogramm for render pass autocamera.
14.8.2022 13:35:52 [Notification] Loaded Shaderprogramm for render pass blockhighlights.
14.8.2022 13:35:52 [Notification] Loaded Shaderprogramm for render pass wireframe.
14.8.2022 13:35:52 [Notification] Loaded Shaderprogramm for render pass entityanimated.
14.8.2022 13:35:52 [Notification] Loaded Shaderprogramm for render pass luma.
14.8.2022 13:35:52 [Notification] Loaded Shaderprogramm for render pass blit.
14.8.2022 13:35:52 [Notification] Loaded Shaderprogramm for render pass particlesquad2d.
14.8.2022 13:35:52 [Notification] Loaded Shaderprogramm for render pass shadowmapentityanimated.
14.8.2022 13:35:52 [Notification] Loaded Shaderprogramm for render pass shadowmapgeneric.
14.8.2022 13:35:52 [Notification] Loaded Shaderprogramm for render pass texture2texture.
14.8.2022 13:35:52 [Notification] Loaded Shaderprogramm for render pass celestialobject.
14.8.2022 13:35:52 [Notification] Loaded Shaderprogramm for render pass guitopsoil.
14.8.2022 13:35:52 [Notification] Loaded Shaderprogramm for render pass colorgrade.
14.8.2022 13:35:52 [Notification] Loaded Shaderprogramm for render pass guigear.
14.8.2022 13:35:52 [Notification] Loaded Shaderprogramm for render pass ssao.
14.8.2022 13:35:52 [Notification] Loaded Shaderprogramm for render pass bilateralblur.
14.8.2022 13:35:52 [Notification] Loaded Shaderprogramm for render pass grass.
14.8.2022 13:35:52 [Notification] Loaded Shaderprogramm for render pass flowers.
14.8.2022 13:35:52 [Notification] Loaded Shaderprogramm for render pass shadowgrass.
14.8.2022 13:35:52 [Notification] Loaded Shaderprogramm for render pass shadowflowers.
14.8.2022 13:35:52 [Notification] Loaded Shaderprogramm for render pass anvilworkitem.
14.8.2022 13:35:52 [Notification] Loaded Shaderprogramm for render pass lines.
14.8.2022 13:35:52 [Notification] Loaded Shaderprogramm for render pass aurora.
14.8.2022 13:35:52 [Notification] Loaded Shaderprogramm for render pass rift.
14.8.2022 13:35:52 [Notification] Loaded Shaderprogramm for render pass machinegear.
14.8.2022 13:35:52 [Notification] Loaded Shaderprogramm for render pass sleepoverlay.
14.8.2022 13:35:52 [Notification] Reloaded shaders now with mod assets
14.8.2022 13:35:52 [Notification] Received level init
14.8.2022 13:35:52 [Notification] Loaded Shaderprogramm for render pass instanced.
14.8.2022 13:35:52 [Notification] Loading world map cache db...
14.8.2022 13:35:52 [Notification] Initialized Music Engine
14.8.2022 13:35:52 [Notification] Blocks tesselated
14.8.2022 13:35:53 [Notification] Texture size is 32 so decal atlas size of 128x128 should suffice
14.8.2022 13:35:54 [Notification] Received level finalize
14.8.2022 13:35:54 [Notification] Loaded Shaderprogramm for render pass clouds.
14.8.2022 13:36:06 [Notification] Finished fully loading sounds (async)
14.8.2022 13:36:36 [Notification] Wow, client daytime drifted off significantly from server daytime (21.4 mins)
14.8.2022 13:38:13 [Notification] Client pause state is now on
14.8.2022 13:38:14 [Notification] Destroying game session, waiting up to 200ms for client threads to exit
14.8.2022 13:38:14 [Notification] Stopping single player server
14.8.2022 13:38:15 [Notification] Exiting current game to main menu, reason: leave world button pressed
14.8.2022 13:38:15 [Notification] GuiScreenConnectingToServer constructed
```
|
1.0
|
14.8.2022 13:35:29 [Notification] Loaded Shaderprogramm for render pass blockhighlights.
14.8.2022 13:35:29 [Notification] Loaded Shaderprogramm for render pass wireframe.
14.8.2022 13:35:29 [Notification] Loaded Shaderprogramm for render pass entityanimated.
14.8.2022 13:35:29 [Notification] Loaded Shaderprogramm for render pass luma.
14.8.2022 13:35:29 [Notification] Loaded Shaderprogramm for render pass blit.
14.8.2022 13:35:29 [Notification] Loaded Shaderprogramm for render pass particlesquad2d.
14.8.2022 13:35:29 [Notification] Loaded Shaderprogramm for render pass shadowmapentityanimated.
14.8.2022 13:35:29 [Notification] Loaded Shaderprogramm for render pass shadowmapgeneric.
14.8.2022 13:35:29 [Notification] Loaded Shaderprogramm for render pass texture2texture.
14.8.2022 13:35:29 [Notification] Loaded Shaderprogramm for render pass celestialobject.
14.8.2022 13:35:29 [Notification] Loaded Shaderprogramm for render pass guitopsoil.
14.8.2022 13:35:29 [Notification] Loaded Shaderprogramm for render pass colorgrade.
14.8.2022 13:35:29 [Notification] Loaded Shaderprogramm for render pass guigear.
14.8.2022 13:35:29 [Notification] Loaded Shaderprogramm for render pass ssao.
14.8.2022 13:35:29 [Notification] Loaded Shaderprogramm for render pass bilateralblur.
14.8.2022 13:35:29 [Notification] Loaded Shaderprogramm for render pass grass.
14.8.2022 13:35:29 [Notification] Loaded Shaderprogramm for render pass flowers.
14.8.2022 13:35:30 [Notification] Loaded Shaderprogramm for render pass shadowgrass.
14.8.2022 13:35:30 [Notification] Loaded Shaderprogramm for render pass shadowflowers.
14.8.2022 13:35:30 [Notification] Cached session key is valid, validating with server
14.8.2022 13:35:30 [Notification] Server validation response: Good
14.8.2022 13:35:30 [Notification] Will search the following paths for mods:
14.8.2022 13:35:30 [Notification] C:\Users\Logan\AppData\Roaming\Vintagestory\Mods
14.8.2022 13:35:30 [Notification] C:\Users\Logan\AppData\Roaming\VintagestoryData\Mods
14.8.2022 13:35:35 [Notification] Initialized GUI Manager
14.8.2022 13:35:36 [Notification] Initialized Server Connection
14.8.2022 13:35:36 [Notification] Server args parsed
14.8.2022 13:35:36 [Notification] GuiScreenConnectingToServer constructed
14.8.2022 13:35:36 [Notification] Server main instantiated
14.8.2022 13:35:42 [Notification] Processed server identification
14.8.2022 13:35:42 [Notification] Map initialized
14.8.2022 13:35:42 [Notification] Received server assets
14.8.2022 13:35:42 [Notification] Loading and pre-starting client side mods...
14.8.2022 13:35:42 [Notification] Will search the following paths for mods:
14.8.2022 13:35:42 [Notification] C:\Users\Logan\AppData\Roaming\Vintagestory\Mods
14.8.2022 13:35:42 [Notification] C:\Users\Logan\AppData\Roaming\VintagestoryData\Mods
14.8.2022 13:35:42 [Notification] Found 40 mods (37 disabled)
14.8.2022 13:35:42 [Notification] Mods, sorted by dependency: game, creative, survival
14.8.2022 13:35:42 [Notification] Instantiated 77 mod systems from 3 enabled mods
14.8.2022 13:35:42 [Notification] Done loading and pre-starting client side mods.
14.8.2022 13:35:42 [Notification] External Origins in load order: modorigin@C:\Users\Logan\AppData\Roaming\Vintagestory\assets\creative\, modorigin@C:\Users\Logan\AppData\Roaming\Vintagestory\assets\survival\
14.8.2022 13:35:42 [Notification] Found 0 external assets in category lang
14.8.2022 13:35:42 [Notification] Found 8 external assets in category patches
14.8.2022 13:35:42 [Notification] Found 43 external assets in category config
14.8.2022 13:35:42 [Notification] Found 24 external assets in category worldproperties
14.8.2022 13:35:42 [Notification] Found 326 external assets in category sounds
14.8.2022 13:35:42 [Notification] Found 2678 external assets in category shapes
14.8.2022 13:35:42 [Notification] Found 13 external assets in category shaders
14.8.2022 13:35:42 [Notification] Found 0 external assets in category shaderincludes
14.8.2022 13:35:42 [Notification] Found 4477 external assets in category textures
14.8.2022 13:35:42 [Notification] Found 63 external assets in category music
14.8.2022 13:35:42 [Notification] Found 17 external assets in category dialog
14.8.2022 13:35:42 [Notification] Found 0 external assets in category compatibility
14.8.2022 13:35:42 [Notification] Reloaded lang file now with mod assets
14.8.2022 13:35:42 [Notification] JsonPatch Loader: Nothing to patch
14.8.2022 13:35:43 [Notification] Received 2408 item types from server
14.8.2022 13:35:43 [Notification] Loaded 8470 block types from server
14.8.2022 13:35:43 [Notification] Reloaded sounds, now with mod assets
14.8.2022 13:35:43 [Notification] Server launched
14.8.2022 13:35:44 [Notification] Composed 1 4096x4096 entities texture atlases from 184 textures (last textureid = 205)
14.8.2022 13:35:45 [Notification] Collected 1573 shapes to tesselate.
14.8.2022 13:35:46 [Notification] Composed 1 4096x4096 items texture atlases from 1232 textures (last textureid = 206)
14.8.2022 13:35:51 [Notification] Composed 1 4096x4096 blocks texture atlases from 3837 textures (last textureid = 207)
14.8.2022 13:35:52 [Notification] Server assets loaded
14.8.2022 13:35:52 [Notification] Loaded Shaderprogramm for render pass anvilworkitem.
14.8.2022 13:35:52 [Notification] Loaded Shaderprogramm for render pass lines.
14.8.2022 13:35:52 [Notification] Loaded Shaderprogramm for render pass aurora.
14.8.2022 13:35:52 [Notification] Loaded Shaderprogramm for render pass rift.
14.8.2022 13:35:52 [Notification] Loaded Shaderprogramm for render pass machinegear.
14.8.2022 13:35:52 [Notification] Loaded Shaderprogramm for render pass sleepoverlay.
14.8.2022 13:35:52 [Notification] Started 48 systems on Client:
14.8.2022 13:35:52 [Notification] Mod 'VSEssentials.dll' (game):
14.8.2022 13:35:52 [Notification] Vintagestory.ServerMods.Core
14.8.2022 13:35:52 [Notification] Vintagestory.GameContent.EntityPartitioning
14.8.2022 13:35:52 [Notification] Vintagestory.GameContent.ErrorReporter
14.8.2022 13:35:52 [Notification] Vintagestory.ServerMods.ModCompatiblityUtil
14.8.2022 13:35:52 [Notification] Vintagestory.ServerMods.NoObf.ModJsonPatchLoader
14.8.2022 13:35:52 [Notification] Vintagestory.GameContent.FallingBlockParticlesModSystem
14.8.2022 13:35:52 [Notification] Vintagestory.GameContent.CharacterExtraDialogs
14.8.2022 13:35:52 [Notification] Vintagestory.GameContent.EntityNameTagRendererRegistry
14.8.2022 13:35:52 [Notification] Vintagestory.GameContent.POIRegistry
14.8.2022 13:35:52 [Notification] Vintagestory.GameContent.RoomRegistry
14.8.2022 13:35:52 [Notification] Vintagestory.GameContent.WeatherSystemCommands
14.8.2022 13:35:52 [Notification] Vintagestory.GameContent.WeatherSystemClient
14.8.2022 13:35:52 [Notification] Vintagestory.GameContent.WorldMapManager
14.8.2022 13:35:52 [Notification] Vintagestory.ServerMods.LoadColorMaps
14.8.2022 13:35:52 [Notification] Vintagestory.GameContent.ClothManager
14.8.2022 13:35:52 [Notification] Mod 'VSSurvivalMod.dll' (survival):
14.8.2022 13:35:52 [Notification] Vintagestory.GameContent.SaplingControl
14.8.2022 13:35:52 [Notification] Vintagestory.GameContent.SurvivalCoreSystem
14.8.2022 13:35:52 [Notification] Vintagestory.ServerMods.GenFromHeightmap
14.8.2022 13:35:52 [Notification] Vintagestory.ServerMods.GenMaps
14.8.2022 13:35:52 [Notification] Vintagestory.ServerMods.DebugSystem
14.8.2022 13:35:52 [Notification] Vintagestory.ServerMods.UpgradeTasks
14.8.2022 13:35:52 [Notification] Vintagestory.GameContent.MyceliumSystem
14.8.2022 13:35:52 [Notification] Vintagestory.GameContent.ModSystemAuction
14.8.2022 13:35:52 [Notification] Vintagestory.GameContent.ModSystemHandbook
14.8.2022 13:35:52 [Notification] Vintagestory.GameContent.LiquidItemStackRenderer
14.8.2022 13:35:52 [Notification] Vintagestory.GameContent.CharacterSystem
14.8.2022 13:35:52 [Notification] Vintagestory.GameContent.FruitingSystem
14.8.2022 13:35:52 [Notification] Vintagestory.GameContent.ModSystemRifts
14.8.2022 13:35:52 [Notification] Vintagestory.GameContent.ModSystemRiftWeather
14.8.2022 13:35:52 [Notification] Vintagestory.GameContent.ModTemperature
14.8.2022 13:35:52 [Notification] Vintagestory.GameContent.SystemTemporalStability
14.8.2022 13:35:52 [Notification] Vintagestory.GameContent.MealMeshCache
14.8.2022 13:35:52 [Notification] Vintagestory.GameContent.ModSystemWearableStats
14.8.2022 13:35:52 [Notification] Vintagestory.GameContent.ModSystemBlockReinforcement
14.8.2022 13:35:52 [Notification] Vintagestory.GameContent.ChiselBlockModelCache
14.8.2022 13:35:52 [Notification] Vintagestory.GameContent.TemporalStabilityEffects
14.8.2022 13:35:52 [Notification] Vintagestory.GameContent.ModJournal
14.8.2022 13:35:52 [Notification] Vintagestory.GameContent.ModLootRandomizer
14.8.2022 13:35:52 [Notification] Vintagestory.GameContent.ModSleeping
14.8.2022 13:35:52 [Notification] Vintagestory.GameContent.TeleporterManager
14.8.2022 13:35:52 [Notification] Vintagestory.GameContent.Mechanics.MechanicalPowerMod
14.8.2022 13:35:52 [Notification] Vintagestory.GameContent.TradeHandbookInfo
14.8.2022 13:35:52 [Notification] Vintagestory.ServerMods.GenStructures
14.8.2022 13:35:52 [Notification] Vintagestory.ServerMods.GenStructuresPosPass
14.8.2022 13:35:52 [Notification] Vintagestory.GameContent.RecipeRegistrySystem
14.8.2022 13:35:52 [Notification] Vintagestory.GameContent.TraderOutfits
14.8.2022 13:35:52 [Notification] Mod 'VSCreativeMod.dll' (creative):
14.8.2022 13:35:52 [Notification] Vintagestory.ServerMods.Core
14.8.2022 13:35:52 [Notification] Vintagestory.ServerMods.WorldEdit.WorldEdit
14.8.2022 13:35:52 [Notification] Loading shaders...
14.8.2022 13:35:52 [Notification] Loaded Shaderprogramm for render pass standard.
14.8.2022 13:35:52 [Notification] Loaded Shaderprogramm for render pass particlescube.
14.8.2022 13:35:52 [Notification] Loaded Shaderprogramm for render pass particlesquad.
14.8.2022 13:35:52 [Notification] Loaded Shaderprogramm for render pass sky.
14.8.2022 13:35:52 [Notification] Loaded Shaderprogramm for render pass nightsky.
14.8.2022 13:35:52 [Notification] Loaded Shaderprogramm for render pass woittest.
14.8.2022 13:35:52 [Notification] Loaded Shaderprogramm for render pass transparentcompose.
14.8.2022 13:35:52 [Notification] Loaded Shaderprogramm for render pass debugdepthbuffer.
14.8.2022 13:35:52 [Notification] Loaded Shaderprogramm for render pass helditem.
14.8.2022 13:35:52 [Notification] Loaded Shaderprogramm for render pass chunkopaque.
14.8.2022 13:35:52 [Notification] Loaded Shaderprogramm for render pass chunkliquid.
14.8.2022 13:35:52 [Notification] Loaded Shaderprogramm for render pass decals.
14.8.2022 13:35:52 [Notification] Loaded Shaderprogramm for render pass final.
14.8.2022 13:35:52 [Notification] Loaded Shaderprogramm for render pass gui.
14.8.2022 13:35:52 [Notification] Loaded Shaderprogramm for render pass blur.
14.8.2022 13:35:52 [Notification] Loaded Shaderprogramm for render pass chunktransparent.
14.8.2022 13:35:52 [Notification] Loaded Shaderprogramm for render pass findbright.
14.8.2022 13:35:52 [Notification] Loaded Shaderprogramm for render pass chunktopsoil.
14.8.2022 13:35:52 [Notification] Loaded Shaderprogramm for render pass godrays.
14.8.2022 13:35:52 [Notification] Loaded Shaderprogramm for render pass autocamera.
14.8.2022 13:35:52 [Notification] Loaded Shaderprogramm for render pass blockhighlights.
14.8.2022 13:35:52 [Notification] Loaded Shaderprogramm for render pass wireframe.
14.8.2022 13:35:52 [Notification] Loaded Shaderprogramm for render pass entityanimated.
14.8.2022 13:35:52 [Notification] Loaded Shaderprogramm for render pass luma.
14.8.2022 13:35:52 [Notification] Loaded Shaderprogramm for render pass blit.
14.8.2022 13:35:52 [Notification] Loaded Shaderprogramm for render pass particlesquad2d.
14.8.2022 13:35:52 [Notification] Loaded Shaderprogramm for render pass shadowmapentityanimated.
14.8.2022 13:35:52 [Notification] Loaded Shaderprogramm for render pass shadowmapgeneric.
14.8.2022 13:35:52 [Notification] Loaded Shaderprogramm for render pass texture2texture.
14.8.2022 13:35:52 [Notification] Loaded Shaderprogramm for render pass celestialobject.
14.8.2022 13:35:52 [Notification] Loaded Shaderprogramm for render pass guitopsoil.
14.8.2022 13:35:52 [Notification] Loaded Shaderprogramm for render pass colorgrade.
14.8.2022 13:35:52 [Notification] Loaded Shaderprogramm for render pass guigear.
14.8.2022 13:35:52 [Notification] Loaded Shaderprogramm for render pass ssao.
14.8.2022 13:35:52 [Notification] Loaded Shaderprogramm for render pass bilateralblur.
14.8.2022 13:35:52 [Notification] Loaded Shaderprogramm for render pass grass.
14.8.2022 13:35:52 [Notification] Loaded Shaderprogramm for render pass flowers.
14.8.2022 13:35:52 [Notification] Loaded Shaderprogramm for render pass shadowgrass.
14.8.2022 13:35:52 [Notification] Loaded Shaderprogramm for render pass shadowflowers.
14.8.2022 13:35:52 [Notification] Loaded Shaderprogramm for render pass anvilworkitem.
14.8.2022 13:35:52 [Notification] Loaded Shaderprogramm for render pass lines.
14.8.2022 13:35:52 [Notification] Loaded Shaderprogramm for render pass aurora.
14.8.2022 13:35:52 [Notification] Loaded Shaderprogramm for render pass rift.
14.8.2022 13:35:52 [Notification] Loaded Shaderprogramm for render pass machinegear.
14.8.2022 13:35:52 [Notification] Loaded Shaderprogramm for render pass sleepoverlay.
14.8.2022 13:35:52 [Notification] Reloaded shaders now with mod assets
14.8.2022 13:35:52 [Notification] Received level init
14.8.2022 13:35:52 [Notification] Loaded Shaderprogramm for render pass instanced.
14.8.2022 13:35:52 [Notification] Loading world map cache db...
14.8.2022 13:35:52 [Notification] Initialized Music Engine
14.8.2022 13:35:52 [Notification] Blocks tesselated
14.8.2022 13:35:53 [Notification] Texture size is 32 so decal atlas size of 128x128 should suffice
14.8.2022 13:35:54 [Notification] Received level finalize
14.8.2022 13:35:54 [Notification] Loaded Shaderprogramm for render pass clouds.
14.8.2022 13:36:06 [Notification] Finished fully loading sounds (async)
14.8.2022 13:36:36 [Notification] Wow, client daytime drifted off significantly from server daytime (21.4 mins)
14.8.2022 13:38:13 [Notification] Client pause state is now on
14.8.2022 13:38:14 [Notification] Destroying game session, waiting up to 200ms for client threads to exit
14.8.2022 13:38:14 [Notification] Stopping single player server
14.8.2022 13:38:15 [Notification] Exiting current game to main menu, reason: leave world button pressed
14.8.2022 13:38:15 [Notification] GuiScreenConnectingToServer constructed
```
|
non_process
|
host rock section missing from rock entries in handbook game version platform windows modded no sp mp singleplayer description the host rock section is missing from the rock entries in the handbook this section was present in how to reproduce open the entry for any rock type expected behavior ores should be listed under host rock for on the rock s page screenshots logs client logger started game version rc unstable screens x y width height primary openal initialized available mono stereo sources graphics card vendor nvidia corporation graphics card version nvidia graphics card renderer nvidia geforce gtx pcie graphics card shadinglanguageversion nvidia via cg compiler cairo graphics version openal version alsoft c framework net framework opentk version a set of fast low level c bindings for opengl opengl es and openal start discovering assets found base assets in category lang found base assets in category patches found base assets in category config found base assets in category worldproperties found base assets in category sounds found base assets in category shapes found base assets in category shaders found base assets in category shaderincludes found base assets in category textures found base assets in category music found base assets in category dialog found base assets in total loading sounds sounds loaded re loaded frame buffers loaded shaderprogramm for render pass cpu cores window was resized to rebuilding framebuffers re loaded frame buffers begin loading shaders load shaders now loading shaders loaded shaderprogramm for render pass standard loaded shaderprogramm for render pass particlescube loaded shaderprogramm for render pass particlesquad loaded shaderprogramm for render pass sky loaded shaderprogramm for render pass nightsky loaded shaderprogramm for render pass woittest loaded shaderprogramm for render pass transparentcompose loaded shaderprogramm for render pass debugdepthbuffer loaded shaderprogramm for render pass helditem loaded 
shaderprogramm for render pass chunkopaque loaded shaderprogramm for render pass chunkliquid loaded shaderprogramm for render pass decals loaded shaderprogramm for render pass final loaded shaderprogramm for render pass gui loaded shaderprogramm for render pass blur loaded shaderprogramm for render pass chunktransparent loaded shaderprogramm for render pass findbright loaded shaderprogramm for render pass chunktopsoil loaded shaderprogramm for render pass godrays loaded shaderprogramm for render pass autocamera loaded shaderprogramm for render pass blockhighlights loaded shaderprogramm for render pass wireframe loaded shaderprogramm for render pass entityanimated loaded shaderprogramm for render pass luma loaded shaderprogramm for render pass blit loaded shaderprogramm for render pass loaded shaderprogramm for render pass shadowmapentityanimated loaded shaderprogramm for render pass shadowmapgeneric loaded shaderprogramm for render pass loaded shaderprogramm for render pass celestialobject loaded shaderprogramm for render pass guitopsoil loaded shaderprogramm for render pass colorgrade loaded shaderprogramm for render pass guigear loaded shaderprogramm for render pass ssao loaded shaderprogramm for render pass bilateralblur loaded shaderprogramm for render pass grass loaded shaderprogramm for render pass flowers loaded shaderprogramm for render pass shadowgrass loaded shaderprogramm for render pass shadowflowers cached session key is valid validating with server server validation response good will search the following paths for mods c users logan appdata roaming vintagestory mods c users logan appdata roaming vintagestorydata mods initialized gui manager initialized server connection server args parsed guiscreenconnectingtoserver constructed server main instantiated processed server identification map initialized received server assets loading and pre starting client side mods will search the following paths for mods c users logan appdata roaming vintagestory mods 
c users logan appdata roaming vintagestorydata mods found mods disabled mods sorted by dependency game creative survival instantiated mod systems from enabled mods done loading and pre starting client side mods external origins in load order modorigin c users logan appdata roaming vintagestory assets creative modorigin c users logan appdata roaming vintagestory assets survival found external assets in category lang found external assets in category patches found external assets in category config found external assets in category worldproperties found external assets in category sounds found external assets in category shapes found external assets in category shaders found external assets in category shaderincludes found external assets in category textures found external assets in category music found external assets in category dialog found external assets in category compatibility reloaded lang file now with mod assets jsonpatch loader nothing to patch received item types from server loaded block types from server reloaded sounds now with mod assets server launched composed entities texture atlases from textures last textureid collected shapes to tesselate composed items texture atlases from textures last textureid composed blocks texture atlases from textures last textureid server assets loaded loaded shaderprogramm for render pass anvilworkitem loaded shaderprogramm for render pass lines loaded shaderprogramm for render pass aurora loaded shaderprogramm for render pass rift loaded shaderprogramm for render pass machinegear loaded shaderprogramm for render pass sleepoverlay started systems on client mod vsessentials dll game vintagestory servermods core vintagestory gamecontent entitypartitioning vintagestory gamecontent errorreporter vintagestory servermods modcompatiblityutil vintagestory servermods noobf modjsonpatchloader vintagestory gamecontent fallingblockparticlesmodsystem vintagestory gamecontent characterextradialogs vintagestory gamecontent 
entitynametagrendererregistry vintagestory gamecontent poiregistry vintagestory gamecontent roomregistry vintagestory gamecontent weathersystemcommands vintagestory gamecontent weathersystemclient vintagestory gamecontent worldmapmanager vintagestory servermods loadcolormaps vintagestory gamecontent clothmanager mod vssurvivalmod dll survival vintagestory gamecontent saplingcontrol vintagestory gamecontent survivalcoresystem vintagestory servermods genfromheightmap vintagestory servermods genmaps vintagestory servermods debugsystem vintagestory servermods upgradetasks vintagestory gamecontent myceliumsystem vintagestory gamecontent modsystemauction vintagestory gamecontent modsystemhandbook vintagestory gamecontent liquiditemstackrenderer vintagestory gamecontent charactersystem vintagestory gamecontent fruitingsystem vintagestory gamecontent modsystemrifts vintagestory gamecontent modsystemriftweather vintagestory gamecontent modtemperature vintagestory gamecontent systemtemporalstability vintagestory gamecontent mealmeshcache vintagestory gamecontent modsystemwearablestats vintagestory gamecontent modsystemblockreinforcement vintagestory gamecontent chiselblockmodelcache vintagestory gamecontent temporalstabilityeffects vintagestory gamecontent modjournal vintagestory gamecontent modlootrandomizer vintagestory gamecontent modsleeping vintagestory gamecontent teleportermanager vintagestory gamecontent mechanics mechanicalpowermod vintagestory gamecontent tradehandbookinfo vintagestory servermods genstructures vintagestory servermods genstructurespospass vintagestory gamecontent reciperegistrysystem vintagestory gamecontent traderoutfits mod vscreativemod dll creative vintagestory servermods core vintagestory servermods worldedit worldedit loading shaders loaded shaderprogramm for render pass standard loaded shaderprogramm for render pass particlescube loaded shaderprogramm for render pass particlesquad loaded shaderprogramm for render pass sky loaded 
shaderprogramm for render pass nightsky loaded shaderprogramm for render pass woittest loaded shaderprogramm for render pass transparentcompose loaded shaderprogramm for render pass debugdepthbuffer loaded shaderprogramm for render pass helditem loaded shaderprogramm for render pass chunkopaque loaded shaderprogramm for render pass chunkliquid loaded shaderprogramm for render pass decals loaded shaderprogramm for render pass final loaded shaderprogramm for render pass gui loaded shaderprogramm for render pass blur loaded shaderprogramm for render pass chunktransparent loaded shaderprogramm for render pass findbright loaded shaderprogramm for render pass chunktopsoil loaded shaderprogramm for render pass godrays loaded shaderprogramm for render pass autocamera loaded shaderprogramm for render pass blockhighlights loaded shaderprogramm for render pass wireframe loaded shaderprogramm for render pass entityanimated loaded shaderprogramm for render pass luma loaded shaderprogramm for render pass blit loaded shaderprogramm for render pass loaded shaderprogramm for render pass shadowmapentityanimated loaded shaderprogramm for render pass shadowmapgeneric loaded shaderprogramm for render pass loaded shaderprogramm for render pass celestialobject loaded shaderprogramm for render pass guitopsoil loaded shaderprogramm for render pass colorgrade loaded shaderprogramm for render pass guigear loaded shaderprogramm for render pass ssao loaded shaderprogramm for render pass bilateralblur loaded shaderprogramm for render pass grass loaded shaderprogramm for render pass flowers loaded shaderprogramm for render pass shadowgrass loaded shaderprogramm for render pass shadowflowers loaded shaderprogramm for render pass anvilworkitem loaded shaderprogramm for render pass lines loaded shaderprogramm for render pass aurora loaded shaderprogramm for render pass rift loaded shaderprogramm for render pass machinegear loaded shaderprogramm for render pass sleepoverlay reloaded shaders now with 
mod assets received level init loaded shaderprogramm for render pass instanced loading world map cache db initialized music engine blocks tesselated texture size is so decal atlas size of should suffice received level finalize loaded shaderprogramm for render pass clouds finished fully loading sounds async wow client daytime drifted off significantly from server daytime mins client pause state is now on destroying game session waiting up to for client threads to exit stopping single player server exiting current game to main menu reason leave world button pressed guiscreenconnectingtoserver constructed
| 0
|
3,000
| 2,789,896,580
|
IssuesEvent
|
2015-05-08 22:15:10
|
joomla/joomla-cms
|
https://api.github.com/repos/joomla/joomla-cms
|
closed
|
Menu item redirecting wrongly
|
No Code Attached Yet
|
#### Steps to reproduce the issue
1. create an article for registered users only,
2. create a menu item to aforementioned article,
3. go to the front-end of the site and log in,
4. access the article through the menu item,
5. log out,
6. click menu item.
#### Expected result
Get error explaining I'm not authorized to access the article since I'm not logged in.
#### Actual result
Get redirected to the site's homepage.
#### System Information
Database: 5.5.40-0ubuntu0.14.04.1
PHP: 5.5.9-1ubuntu4.4
Joomla version: Joomla! 3.3.6 Stable
Joomla platform version: Joomla Platform 13.1.0 Stable
User Agent: Mozilla/5.0 Firefox/33.0
#### Additional Information
Make sure to use an incognito/private browsing window to ensure that neither cookies nor the cache will affect the testing.
If you try accessing the article without ever having logged in, you'll get what was described in the "Expected result" section, so it's the logging in that's causing this behaviour after logging out.
|
1.0
|
Menu item redirecting wrongly - #### Steps to reproduce the issue
1. create an article for registered users only,
2. create a menu item to aforementioned article,
3. go to the front-end of the site and log in,
4. access the article through the menu item,
5. log out,
6. click menu item.
#### Expected result
Get error explaining I'm not authorized to access the article since I'm not logged in.
#### Actual result
Get redirected to the site's homepage.
#### System Information
Database: 5.5.40-0ubuntu0.14.04.1
PHP: 5.5.9-1ubuntu4.4
Joomla version: Joomla! 3.3.6 Stable
Joomla platform version: Joomla Platform 13.1.0 Stable
User Agent: Mozilla/5.0 Firefox/33.0
#### Additional Information
Make sure to use an incognito/private browsing window to ensure that neither cookies nor the cache will affect the testing.
If you try accessing the article without ever having logged in, you'll get what was described in the "Expected result" section, so it's the logging in that's causing this behaviour after logging out.
|
non_process
|
menu item redirecting wrongly steps to reproduce the issue create an article for registered users only create a menu item to aforementioned article go to the front end of the site and log in access the article through the menu item log out click menu item expected result get error explaining i m not authorized to access the article since i m not logged in actual result get redirected to the site s homepage system information database php joomla version joomla stable joomla platform version joomla platform stable user agent mozilla firefox additional information make sure to use an incognito private browsing window to ensure that neither cookies nor the cache will affect the testing if you try accessing the article without ever having logged in you ll get what was described in the expected result section so it s the logging in that s causing this behaviour after logging out
| 0
|
15,126
| 18,869,576,142
|
IssuesEvent
|
2021-11-13 00:46:35
|
RobertCraigie/prisma-client-py
|
https://api.github.com/repos/RobertCraigie/prisma-client-py
|
closed
|
Add support for the Bytes type
|
topic: types kind/feature process/candidate
|
## Problem
<!-- A clear and concise description of what the problem is. Ex. I'm always frustrated when [...] -->
Prisma supports a `Bytes` data type, we should support it too.
[https://www.prisma.io/docs/reference/api-reference/prisma-schema-reference#bytes](https://www.prisma.io/docs/reference/api-reference/prisma-schema-reference#bytes)
## Suggested Solution
Data of this type must be valid Base64, as pydantic does not support this yet (https://github.com/samuelcolvin/pydantic/issues/692) we will have to write our own wrapper type over this.
Goals
- Minimise redundant transformations from `bytes` to `str`
- Expose both a `str` interface and a `bytes` interface
Suggested interface:
```py
class Base64:
    value: bytes

    def __init__(self, value: bytes) -> None:
        ...

    @classmethod
    def encode(cls, value: bytes) -> 'Base64':
        ...

    def decode(self) -> str:
        ...

    def decode_bytes(self) -> bytes:
        ...
```
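A minimal sketch of how such a wrapper could be implemented with the standard-library `base64` module (method names follow the suggested interface above; this is an illustrative assumption, not the client's actual implementation):

```python
import base64


class Base64:
    """Wrapper holding a Base64-encoded value as raw bytes."""

    def __init__(self, value: bytes) -> None:
        # value is assumed to already be valid Base64
        self.value = value

    @classmethod
    def encode(cls, value: bytes) -> "Base64":
        # Encode arbitrary raw bytes into a Base64 wrapper
        return cls(base64.b64encode(value))

    def decode(self) -> str:
        # Decoded payload as a UTF-8 string
        return self.decode_bytes().decode("utf-8")

    def decode_bytes(self) -> bytes:
        # Decoded payload as raw bytes
        return base64.b64decode(self.value)


b64 = Base64.encode(b"hello")
print(b64.value)    # b'aGVsbG8='
print(b64.decode()) # hello
```

Keeping `value` as already-encoded bytes avoids redundant round-trips between `bytes` and `str`, while `decode()` and `decode_bytes()` expose both interfaces as the goals require.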
|
1.0
|
Add support for the Bytes type - ## Problem
<!-- A clear and concise description of what the problem is. Ex. I'm always frustrated when [...] -->
Prisma supports a `Bytes` data type, we should support it too.
[https://www.prisma.io/docs/reference/api-reference/prisma-schema-reference#bytes](https://www.prisma.io/docs/reference/api-reference/prisma-schema-reference#bytes)
## Suggested Solution
Data of this type must be valid Base64, as pydantic does not support this yet (https://github.com/samuelcolvin/pydantic/issues/692) we will have to write our own wrapper type over this.
Goals
- Minimise redundant transformations from `bytes` to `str`
- Expose both a `str` interface and a `bytes` interface
Suggested interface:
```py
class Base64:
value: bytes
def __init__(self, value: bytes) -> None:
...
@classmethod
def encode(cls, value: bytes) -> 'Base64':
...
def decode(self) -> str:
...
def decode_bytes(self) -> bytes:
...
```
|
process
|
add support for the bytes type problem prisma supports a bytes data type we should support it too suggested solution data of this type must be valid as pydantic does not support this yet we will have to write our own wrapper type over this goals minimise redundant transformations from bytes to str expose both a str interface and a bytes interface suggested interface py class value bytes def init self value bytes none classmethod def encode cls value bytes def decode self str def decode bytes self bytes
| 1
|
184,363
| 31,863,596,027
|
IssuesEvent
|
2023-09-15 12:43:50
|
readthedocs/readthedocs.org
|
https://api.github.com/repos/readthedocs/readthedocs.org
|
closed
|
Put anchor when searching for inactive versions
|
Improvement Needed: design decision
|
After filtering inactive versions you have to scroll down past all active versions to find the search result. A simple anchor should fix this.
We are introducing one here https://github.com/readthedocs/readthedocs.org/pull/6276/files, so we could just make the change in that PR
|
1.0
|
Put anchor when searching for inactive versions - After filtering inactive versions you have to scroll down past all active versions to find the search result. A simple anchor should fix this.
We are introducing one here https://github.com/readthedocs/readthedocs.org/pull/6276/files, so we could just make the change in that PR
|
non_process
|
put anchor when searching for inactive versions after filtering inactive versions you have to scroll down all active versions to find the search result a simple anchor should fix this we are introducing one here so we could just make the change in that pr
| 0
|
89,127
| 10,589,055,002
|
IssuesEvent
|
2019-10-09 04:34:45
|
input-output-hk/shelley-testnet
|
https://api.github.com/repos/input-output-hk/shelley-testnet
|
closed
|
Let non-technical users know that .sh is a file extension; and explain how to give it executable permissions
|
documentation
|
**Describe the bug**
Issue 7.
9. Save it as a script (.sh) and give it executable permissions.
It would be good to let CP know that .sh is a file extension. Also explain how to give it executable permissions. We don't know what give it executable permissions means and if this process differs by OS (I guess it does).
**To Reproduce**
Steps to reproduce the behavior:
1. Go to the page
https://testnet.iohkdev.io/cardano/shelley/get-started/setting-up-the-self-node/
2. find 9. Save it as a script (.sh) and give it executable permissions
**Expected behavior**
let CP know that .sh is a file extension. Also explain how to give it executable permissions; and ensure that those instructions are available for all 3 supported OS.
**Additional context**
Please add any other context about the problem here.
|
1.0
|
Let non-technical users know that .sh is a file extension; and explain how to give it executable permissions - **Describe the bug**
Issue 7.
9. Save it as a script (.sh) and give it executable permissions.
It would be good to let CP know that .sh is a file extension. Also explain how to give it executable permissions. We don't know what give it executable permissions means and if this process differs by OS (I guess it does).
**To Reproduce**
Steps to reproduce the behavior:
1. Go to the page
https://testnet.iohkdev.io/cardano/shelley/get-started/setting-up-the-self-node/
2. find 9. Save it as a script (.sh) and give it executable permissions
**Expected behavior**
let CP know that .sh is a file extension. Also explain how to give it executable permissions; and ensure that those instructions are available for all 3 supported OS.
**Additional context**
Please add any other context about the problem here.
|
non_process
|
let non technical users know that sh is a file extension and explain how to give it executable permissions describe the bug issue save it as a script sh and give it executable permissions it would be good to let cp know that sh is a file extension also explain how to give it executable permissions we don t know what give it executable permissions means and if this process differs by os i guess it does to reproduce steps to reproduce the behavior go to the page find save it as a script sh and give it executable permissions expected behavior let cp know that sh is a file extension also explain how to give it executable permissions and ensure that those instructions are available for all supported os additional context please add any other context about the problem here
| 0
|
6,147
| 9,014,837,641
|
IssuesEvent
|
2019-02-05 23:51:42
|
nodejs/node
|
https://api.github.com/repos/nodejs/node
|
closed
|
Missing process functions under workers should throw explicit error
|
feature request process worker
|
<!--
Thank you for suggesting an idea to make Node.js better.
Please fill in as much of the template below as you're able.
-->
**Is your feature request related to a problem? Please describe.**
Jest calls [`mkdirp`](https://npmjs.com/package/mkdirp) during its unit tests. Jest has also recently added support for `worker_threads`. However, when I wanted to try it out, hundreds of tests failed with `TypeError: process.umask is not a function`
It turns out `mkdirp` uses `process.umask()` if no `mode` is provided: https://github.com/substack/node-mkdirp/blob/f2003bbcffa80f8c9744579fabab1212fc84545a/index.js#L64
I've since found that this function missing in `worker_threads` is well documented: https://nodejs.org/api/worker_threads.html#worker_threads_class_worker
>`process.chdir()` and `process` methods that set group or user ids are not available.
**Describe the solution you'd like**
If node could throw "`process.umask` is not available in worker threads" or similar it would have made my debugging way easier, rather than the function just be missing.
I'm not sure how that would play with `typeof process.umask === 'function'` checks people might have?
**Describe alternatives you've considered**
My solution was to pass `777` as mode explicitly to `mkdirp`, but a clearer error would have made the debugging way easier.
My use case might be a bit special since Jest reconstructs a fake `process` object for every single test (by inspecting the real one), so my rabbit hole before checking worker docs was probably deeper than most people's in the same situation. I was also testing node 12 (where threads are unflagged), so I wasn't even aware I was running in threads at first.
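The umask-derived default described above can be sketched in Python, where `os.umask` behaves analogously to `process.umask` (the helper name here is mine, not mkdirp's):

```python
import os


def default_dir_mode() -> int:
    """Compute a mkdirp-style default directory mode from the process umask."""
    # os.umask sets a new mask and returns the previous one, so
    # set-and-restore is the portable way to read it non-destructively.
    current = os.umask(0)
    os.umask(current)
    return 0o777 & ~current
```

Passing an explicit mode, as in the workaround above, simply bypasses this computation.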
|
1.0
|
Missing process functions under workers should throw explicit error - <!--
Thank you for suggesting an idea to make Node.js better.
Please fill in as much of the template below as you're able.
-->
**Is your feature request related to a problem? Please describe.**
Jest calls [`mkdirp`](https://npmjs.com/package/mkdirp) during its unit tests. Jest has also recently added support for `worker_threads`. However, when I wanted to try it out, hundreds of tests failed with `TypeError: process.umask is not a function`
It turns out `mkdirp` uses `process.umask()` if no `mode` is provided: https://github.com/substack/node-mkdirp/blob/f2003bbcffa80f8c9744579fabab1212fc84545a/index.js#L64
I've since found that this function missing in `worker_threads` is well documented: https://nodejs.org/api/worker_threads.html#worker_threads_class_worker
>`process.chdir()` and `process` methods that set group or user ids are not available.
**Describe the solution you'd like**
If node could throw "`process.umask` is not available in worker threads" or similar it would have made my debugging way easier, rather than the function just be missing.
I'm not sure how that would play with `typeof process.umask === 'function'` checks people might have?
**Describe alternatives you've considered**
My solution was to pass `777` as mode explicitly to `mkdirp`, but a clearer error would have made the debugging way easier.
My use case might be a bit special since Jest reconstructs a fake `process` object for every single test (by inspecting the real one), so my rabbit hole before checking worker docs was probably deeper than most people's in the same situation. I was also testing node 12 (where threads are unflagged), so I wasn't even aware I was running in threads at first.
|
process
|
missing process functions under workers should throw explicit error thank you for suggesting an idea to make node js better please fill in as much of the template below as you re able is your feature request related to a problem please describe jest calls during its unit tests jest has also recently added support for worker threads however when i wanted to try it out hundreds of tests failed with typeerror process umask is not a function it turn out mkdirp uses process umask if no mode is provided i ve since found that this function missing in worker threads is well documented process chdir and process methods that set group or user ids are not available describe the solution you d like if node could throw process umask is not available in worker threads or similar it would have made my debugging way easier rather than the function just be missing i m not sure how that would play with typeof process umask function checks people might have describe alternatives you ve considered my solution was to pass as mode explicitly to mkdirp but a clearer error would have made the debugging way easier my use case might be a bit special since jest reconstructs a fake process object for every single test by inspecting the real one so my rabbit hole before checking worker docs were probably deeper than most people in the same situation i was also testing node where threads are unflagged so i wasn t even aware i was running in threads at first
| 1
|
4,466
| 7,332,697,873
|
IssuesEvent
|
2018-03-05 17:02:28
|
cedardevs/psi
|
https://api.github.com/repos/cedardevs/psi
|
closed
|
Improve error handling in script-wrapper
|
EPIC: PSI DSCOVR datastream psi-processor ready
|
We don't want to output an exception to the output topic.
|
1.0
|
Improve error handling in script-wrapper - We don't want to output an exception to the output topic.
|
process
|
improve error handling in script wrapper we dont want to output an exception to the output topic
| 1
|
3,378
| 5,798,727,289
|
IssuesEvent
|
2017-05-03 03:23:50
|
SChAth/dillmann
|
https://api.github.com/repos/SChAth/dillmann
|
closed
|
homophones
|
enhancement requirement
|
@AndreasEllwardt
it can only happen in the normal search; I have changed the type name. All (and only) the alternatives in the list are now possible in this search type.
simply test:
http://betamasaheft.aai.uni-hamburg.de/Dillmann/index.html?q=ሀ
http://betamasaheft.aai.uni-hamburg.de/Dillmann/index.html?q=ሠ
http://betamasaheft.aai.uni-hamburg.de/Dillmann/index.html?q=ሰ
note that, at the moment, ሀ ሃ ሐ ሓ ኀ ኃ is only valid within the same order, so ሀ also searches for ሐ or ኀ, not for ሃ etc.; likewise ሃ also searches for ሓ and ኃ and not for ሀ etc. Searching as variants within the same order could give wrong results; several and more specific hints could easily be added.
|
1.0
|
homophones - @AndreasEllwardt
it can only happen in the normal search; I have changed the type name. All (and only) the alternatives in the list are now possible in this search type.
simply test:
http://betamasaheft.aai.uni-hamburg.de/Dillmann/index.html?q=ሀ
http://betamasaheft.aai.uni-hamburg.de/Dillmann/index.html?q=ሠ
http://betamasaheft.aai.uni-hamburg.de/Dillmann/index.html?q=ሰ
note that, at the moment, ሀ ሃ ሐ ሓ ኀ ኃ is only valid within the same order, so ሀ also searches for ሐ or ኀ, not for ሃ etc.; likewise ሃ also searches for ሓ and ኃ and not for ሀ etc. Searching as variants within the same order could give wrong results; several and more specific hints could easily be added.
|
non_process
|
homophones andreasellwardt it can only happen in the normal search i have changed the type name all and only the alternatives in the list are now possible in this search type simply test note that at the moment ሀ ሃ ሐ ሓ ኀ ኃ is only valid within the same order so ሀ also searches for ሐ or ኀ not for ሃ etc likewise ሃ also searches for ሓ and ኃ and not for ሀ etc searching as variants within the same order could give wrong results several and more specific hints could easily be added
| 0
|
14,591
| 17,703,532,329
|
IssuesEvent
|
2021-08-25 03:13:26
|
tdwg/dwc
|
https://api.github.com/repos/tdwg/dwc
|
closed
|
New term - identifiedByID
|
Term - add Class - Identification normative Process - complete
|
[Edited, accommodating feedback below]
## New Term
Submitter: Tim Robertson (on behalf of @dshorthouse)
Justification: There is no way to identify individuals by e.g. ORCIDs
Proponents: GBIF (already in production), CETAF, DiSCCO
Definition: A list (concatenated and separated) of the globally unique identifiers for the person, people, groups, or organizations responsible for assigning the Taxon to the subject.
Comment: Recommended best practice is to provide a single identifier that disambiguates the details of the identifying agent. If a list is used, the order of the identifiers on the list should not be assumed to convey any semantics. Recommended best practice is to separate the values in a list with space vertical bar space ( | ).
Examples: `https://orcid.org/0000-0002-1825-0097` (for an individual), `https://orcid.org/0000-0002-1825-0097 | https://orcid.org/0000-0002-1825-0098` (for a list of people).
Refines: None
Replaces: None
ABCD 2.06: not in ABCD
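The space-vertical-bar-space convention recommended above can be sketched as a one-line helper (the function name is mine, for illustration only):

```python
def join_identifiers(identifiers: list[str]) -> str:
    # Recommended best practice: separate the values in a list
    # with space vertical bar space ( | ).
    return ' | '.join(identifiers)
```

Joining the two ORCID examples above yields `'https://orcid.org/0000-0002-1825-0097 | https://orcid.org/0000-0002-1825-0098'`.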
|
1.0
|
New term - identifiedByID - [Edited, accommodating feedback below]
## New Term
Submitter: Tim Robertson (on behalf of @dshorthouse)
Justification: There is no way to identify individuals by e.g. ORCIDs
Proponents: GBIF (already in production), CETAF, DiSCCO
Definition: A list (concatenated and separated) of the globally unique identifiers for the person, people, groups, or organizations responsible for assigning the Taxon to the subject.
Comment: Recommended best practice is to provide a single identifier that disambiguates the details of the identifying agent. If a list is used, the order of the identifiers on the list should not be assumed to convey any semantics. Recommended best practice is to separate the values in a list with space vertical bar space ( | ).
Examples: `https://orcid.org/0000-0002-1825-0097` (for an individual), `https://orcid.org/0000-0002-1825-0097 | https://orcid.org/0000-0002-1825-0098` (for a list of people).
Refines: None
Replaces: None
ABCD 2.06: not in ABCD
|
process
|
new term identifiedbyid new term submitter tim robertson on behalf of dshorthouse justification there is no way to identify individuals by e g orcids proponents gbif already in production cetaf discco definition a list concatenated and separated of the globally unique identifiers for the person people groups or organizations responsible for assigning the taxon to the subject comment recommended best practice is to provide a single identifier that disambiguates the details of the identifying agent if a list is used the order of the identifiers on the list should not be assumed to convey any semantics recommended best practice is to separate the values in a list with space vertical bar space examples for an individual for a list of people refines none replaces none abcd not in abcd
| 1
|
184,864
| 6,717,043,930
|
IssuesEvent
|
2017-10-14 16:27:39
|
dalaranwow/dalaran-wow
|
https://api.github.com/repos/dalaranwow/dalaran-wow
|
closed
|
[Spell][Buff] Mark of the Wild/Gift of the Wild Resistance Bonus
|
Class - Druid Fixed - On Live Server Priority - High
|
- **Issue present:** Mark/Gift of the Wild no longer provides Resistance bonus after the recent Update (14/10/17)
- **How it should work:** Any level of Mark/Gift of the Wild should provide suitable Resistance bonus.
- **ID's:**
```
1126 - [Mark of the Wild, rank 1 enUS]
5232 - [Mark of the Wild, rank 2 enUS]
5234 - [Mark of the Wild, rank 4 enUS]
6756 - [Mark of the Wild, rank 3 enUS]
8907 - [Mark of the Wild, rank 5 enUS]
9884 - [Mark of the Wild, rank 6 enUS]
9885 - [Mark of the Wild, rank 7 enUS]
21849 - [Gift of the Wild, rank 1 enUS]
21850 - [Gift of the Wild, rank 2 enUS]
26991 - [Gift of the Wild, rank 3 enUS]
48470 - [Gift of the Wild, rank 4 enUS]
```
----
Edit: It is not only Gift/Mark but also any other spell providing resistance
Fix incoming.
|
1.0
|
[Spell][Buff] Mark of the Wild/Gift of the Wild Resistance Bonus - - **Issue present:** Mark/Gift of the Wild no longer provides Resistance bonus after the recent Update (14/10/17)
- **How it should work:** Any level of Mark/Gift of the Wild should provide suitable Resistance bonus.
- **ID's:**
```
1126 - [Mark of the Wild, rank 1 enUS]
5232 - [Mark of the Wild, rank 2 enUS]
5234 - [Mark of the Wild, rank 4 enUS]
6756 - [Mark of the Wild, rank 3 enUS]
8907 - [Mark of the Wild, rank 5 enUS]
9884 - [Mark of the Wild, rank 6 enUS]
9885 - [Mark of the Wild, rank 7 enUS]
21849 - [Gift of the Wild, rank 1 enUS]
21850 - [Gift of the Wild, rank 2 enUS]
26991 - [Gift of the Wild, rank 3 enUS]
48470 - [Gift of the Wild, rank 4 enUS]
```
----
Edit: It is not only Gift/Mark but also any other spell providing resistance
Fix incoming.
|
non_process
|
mark of the wild gift of the wild resistance bonus issue present mark gift of the wild no longer provides resistance bonus after the recent update how it should work any level of mark gift of the wild should provide suitable resistance bonus id s edit it is only gift mark but also any other spell providing resistance fix incoming
| 0
|
9,104
| 12,190,047,515
|
IssuesEvent
|
2020-04-29 08:37:53
|
threefoldfoundation/tft-stellar
|
https://api.github.com/repos/threefoldfoundation/tft-stellar
|
closed
|
How not to force https for the faucet
|
priority_minor process_wontfix type_question
|
Is it possible not to force https?
```sh
root@3bot:/sandbox/cfg/nginx/default_openresty_threebot/servers/default_website_80_locations# cat threefoldfoundation.stellar_faucet.conf
location /threefoldfoundation/stellar_faucet {
# to keep hostname (if it's with e.g. port)
absolute_redirect off;
# set host header if available
set $req_host $host;
if ($http_host) {
set $req_host $http_host;
}
# force http
if ($scheme = http) {
return 301 https://$host$request_uri;
}
```
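One way, sketched under the assumption that the generated location block can be edited directly, is to drop the scheme check so plain http is served instead of redirected:

```nginx
location /threefoldfoundation/stellar_faucet {
    # to keep hostname (if it's with e.g. port)
    absolute_redirect off;

    # set host header if available
    set $req_host $host;
    if ($http_host) {
        set $req_host $http_host;
    }

    # the "force http" scheme check is removed, so http requests
    # are no longer answered with a 301 redirect to https
}
```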
|
1.0
|
How not to force https for the faucet - Is it possible not to force https?
```sh
root@3bot:/sandbox/cfg/nginx/default_openresty_threebot/servers/default_website_80_locations# cat threefoldfoundation.stellar_faucet.conf
location /threefoldfoundation/stellar_faucet {
# to keep hostname (if it's with e.g. port)
absolute_redirect off;
# set host header if available
set $req_host $host;
if ($http_host) {
set $req_host $http_host;
}
# force http
if ($scheme = http) {
return 301 https://$host$request_uri;
}
```
|
process
|
how not to force https for the faucet is it possible not to force https sh root sandbox cfg nginx default openresty threebot servers default website locations cat threefoldfoundation stellar faucet conf location threefoldfoundation stellar faucet to keep hostname if it s with e g port absolute redirect off set host header if available set req host host if http host set req host http host force http if scheme http return
| 1
|
619,662
| 19,532,073,035
|
IssuesEvent
|
2021-12-30 18:59:32
|
massenergize/frontend-portal
|
https://api.github.com/repos/massenergize/frontend-portal
|
closed
|
Console warning: AppRouter : Cannot update during an existing state transition
|
bug priority 2
|
index.js:1 Warning: Cannot update during an existing state transition (such as within `render`). Render methods should be a pure function of props and state.
in AppRouter (created by Connect(AppRouter))
in Connect(AppRouter) (created by Context.Consumer)
in Route (at App.js:85)
in Switch (at App.js:84)
in App (created by Connect(App))
in Connect(App) (at src/index.js:37)
in ScrollToTop (created by Context.Consumer)
in withRouter(ScrollToTop) (at src/index.js:36)
in Router (created by ConnectedRouter)
in ConnectedRouter (created by Context.Consumer)
in ConnectedRouterWithContext (created by Connect(ConnectedRouterWithContext))
in Connect(ConnectedRouterWithContext) (at src/index.js:35)
in ReactReduxFirebaseProvider (at src/index.js:34)
in Provider (at src/index.js:33)
|
1.0
|
Console warning: AppRouter : Cannot update during an existing state transition - index.js:1 Warning: Cannot update during an existing state transition (such as within `render`). Render methods should be a pure function of props and state.
in AppRouter (created by Connect(AppRouter))
in Connect(AppRouter) (created by Context.Consumer)
in Route (at App.js:85)
in Switch (at App.js:84)
in App (created by Connect(App))
in Connect(App) (at src/index.js:37)
in ScrollToTop (created by Context.Consumer)
in withRouter(ScrollToTop) (at src/index.js:36)
in Router (created by ConnectedRouter)
in ConnectedRouter (created by Context.Consumer)
in ConnectedRouterWithContext (created by Connect(ConnectedRouterWithContext))
in Connect(ConnectedRouterWithContext) (at src/index.js:35)
in ReactReduxFirebaseProvider (at src/index.js:34)
in Provider (at src/index.js:33)
|
non_process
|
console warning approuter cannot update during an existing state transition index js warning cannot update during an existing state transition such as within render render methods should be a pure function of props and state in approuter created by connect approuter in connect approuter created by context consumer in route at app js in switch at app js in app created by connect app in connect app at src index js in scrolltotop created by context consumer in withrouter scrolltotop at src index js in router created by connectedrouter in connectedrouter created by context consumer in connectedrouterwithcontext created by connect connectedrouterwithcontext in connect connectedrouterwithcontext at src index js in reactreduxfirebaseprovider at src index js in provider at src index js
| 0
|
137,958
| 18,769,550,346
|
IssuesEvent
|
2021-11-06 15:29:31
|
samqws-marketing/box_box-ui-elements
|
https://api.github.com/repos/samqws-marketing/box_box-ui-elements
|
opened
|
CVE-2020-7598 (Medium) detected in minimist-1.1.3.tgz, minimist-1.2.0.tgz
|
security vulnerability
|
## CVE-2020-7598 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Libraries - <b>minimist-1.1.3.tgz</b>, <b>minimist-1.2.0.tgz</b></p></summary>
<p>
<details><summary><b>minimist-1.1.3.tgz</b></p></summary>
<p>parse argument options</p>
<p>Library home page: <a href="https://registry.npmjs.org/minimist/-/minimist-1.1.3.tgz">https://registry.npmjs.org/minimist/-/minimist-1.1.3.tgz</a></p>
<p>Path to dependency file: box_box-ui-elements/package.json</p>
<p>Path to vulnerable library: box_box-ui-elements/node_modules/minimist/package.json</p>
<p>
Dependency Hierarchy:
- stylelint-12.0.0.tgz (Root Library)
- postcss-sass-0.4.2.tgz
- gonzales-pe-4.2.4.tgz
- :x: **minimist-1.1.3.tgz** (Vulnerable Library)
</details>
<details><summary><b>minimist-1.2.0.tgz</b></p></summary>
<p>parse argument options</p>
<p>Library home page: <a href="https://registry.npmjs.org/minimist/-/minimist-1.2.0.tgz">https://registry.npmjs.org/minimist/-/minimist-1.2.0.tgz</a></p>
<p>Path to dependency file: box_box-ui-elements/package.json</p>
<p>Path to vulnerable library: box_box-ui-elements/node_modules/minimist/package.json</p>
<p>
Dependency Hierarchy:
- react-5.3.9.tgz (Root Library)
- core-5.3.9.tgz
- json5-2.1.1.tgz
- :x: **minimist-1.2.0.tgz** (Vulnerable Library)
</details>
<p>Found in HEAD commit: <a href="https://github.com/samqws-marketing/box_box-ui-elements/commit/4fc776e2b95c8b497f6994cb2165365562ae1f82">4fc776e2b95c8b497f6994cb2165365562ae1f82</a></p>
<p>Found in base branch: <b>master</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
minimist before 1.2.2 could be tricked into adding or modifying properties of Object.prototype using a "constructor" or "__proto__" payload.
<p>Publish Date: 2020-03-11
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-7598>CVE-2020-7598</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>5.6</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: High
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: Low
- Integrity Impact: Low
- Availability Impact: Low
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://github.com/substack/minimist/commit/63e7ed05aa4b1889ec2f3b196426db4500cbda94">https://github.com/substack/minimist/commit/63e7ed05aa4b1889ec2f3b196426db4500cbda94</a></p>
<p>Release Date: 2020-03-11</p>
<p>Fix Resolution: minimist - 0.2.1,1.2.3</p>
</p>
</details>
<p></p>
<!-- <REMEDIATE>{"isOpenPROnVulnerability":false,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"javascript/Node.js","packageName":"minimist","packageVersion":"1.1.3","packageFilePaths":["/package.json"],"isTransitiveDependency":true,"dependencyTree":"stylelint:12.0.0;postcss-sass:0.4.2;gonzales-pe:4.2.4;minimist:1.1.3","isMinimumFixVersionAvailable":true,"minimumFixVersion":"minimist - 0.2.1,1.2.3"},{"packageType":"javascript/Node.js","packageName":"minimist","packageVersion":"1.2.0","packageFilePaths":["/package.json"],"isTransitiveDependency":true,"dependencyTree":"@storybook/react:5.3.9;@storybook/core:5.3.9;json5:2.1.1;minimist:1.2.0","isMinimumFixVersionAvailable":true,"minimumFixVersion":"minimist - 0.2.1,1.2.3"}],"baseBranches":["master"],"vulnerabilityIdentifier":"CVE-2020-7598","vulnerabilityDetails":"minimist before 1.2.2 could be tricked into adding or modifying properties of Object.prototype using a \"constructor\" or \"__proto__\" payload.","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-7598","cvss3Severity":"medium","cvss3Score":"5.6","cvss3Metrics":{"A":"Low","AC":"High","PR":"None","S":"Unchanged","C":"Low","UI":"None","AV":"Network","I":"Low"},"extraData":{}}</REMEDIATE> -->
|
True
|
CVE-2020-7598 (Medium) detected in minimist-1.1.3.tgz, minimist-1.2.0.tgz - ## CVE-2020-7598 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Libraries - <b>minimist-1.1.3.tgz</b>, <b>minimist-1.2.0.tgz</b></p></summary>
<p>
<details><summary><b>minimist-1.1.3.tgz</b></p></summary>
<p>parse argument options</p>
<p>Library home page: <a href="https://registry.npmjs.org/minimist/-/minimist-1.1.3.tgz">https://registry.npmjs.org/minimist/-/minimist-1.1.3.tgz</a></p>
<p>Path to dependency file: box_box-ui-elements/package.json</p>
<p>Path to vulnerable library: box_box-ui-elements/node_modules/minimist/package.json</p>
<p>
Dependency Hierarchy:
- stylelint-12.0.0.tgz (Root Library)
- postcss-sass-0.4.2.tgz
- gonzales-pe-4.2.4.tgz
- :x: **minimist-1.1.3.tgz** (Vulnerable Library)
</details>
<details><summary><b>minimist-1.2.0.tgz</b></p></summary>
<p>parse argument options</p>
<p>Library home page: <a href="https://registry.npmjs.org/minimist/-/minimist-1.2.0.tgz">https://registry.npmjs.org/minimist/-/minimist-1.2.0.tgz</a></p>
<p>Path to dependency file: box_box-ui-elements/package.json</p>
<p>Path to vulnerable library: box_box-ui-elements/node_modules/minimist/package.json</p>
<p>
Dependency Hierarchy:
- react-5.3.9.tgz (Root Library)
- core-5.3.9.tgz
- json5-2.1.1.tgz
- :x: **minimist-1.2.0.tgz** (Vulnerable Library)
</details>
<p>Found in HEAD commit: <a href="https://github.com/samqws-marketing/box_box-ui-elements/commit/4fc776e2b95c8b497f6994cb2165365562ae1f82">4fc776e2b95c8b497f6994cb2165365562ae1f82</a></p>
<p>Found in base branch: <b>master</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
minimist before 1.2.2 could be tricked into adding or modifying properties of Object.prototype using a "constructor" or "__proto__" payload.
<p>Publish Date: 2020-03-11
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-7598>CVE-2020-7598</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>5.6</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: High
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: Low
- Integrity Impact: Low
- Availability Impact: Low
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://github.com/substack/minimist/commit/63e7ed05aa4b1889ec2f3b196426db4500cbda94">https://github.com/substack/minimist/commit/63e7ed05aa4b1889ec2f3b196426db4500cbda94</a></p>
<p>Release Date: 2020-03-11</p>
<p>Fix Resolution: minimist - 0.2.1,1.2.3</p>
</p>
</details>
<p></p>
<!-- <REMEDIATE>{"isOpenPROnVulnerability":false,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"javascript/Node.js","packageName":"minimist","packageVersion":"1.1.3","packageFilePaths":["/package.json"],"isTransitiveDependency":true,"dependencyTree":"stylelint:12.0.0;postcss-sass:0.4.2;gonzales-pe:4.2.4;minimist:1.1.3","isMinimumFixVersionAvailable":true,"minimumFixVersion":"minimist - 0.2.1,1.2.3"},{"packageType":"javascript/Node.js","packageName":"minimist","packageVersion":"1.2.0","packageFilePaths":["/package.json"],"isTransitiveDependency":true,"dependencyTree":"@storybook/react:5.3.9;@storybook/core:5.3.9;json5:2.1.1;minimist:1.2.0","isMinimumFixVersionAvailable":true,"minimumFixVersion":"minimist - 0.2.1,1.2.3"}],"baseBranches":["master"],"vulnerabilityIdentifier":"CVE-2020-7598","vulnerabilityDetails":"minimist before 1.2.2 could be tricked into adding or modifying properties of Object.prototype using a \"constructor\" or \"__proto__\" payload.","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-7598","cvss3Severity":"medium","cvss3Score":"5.6","cvss3Metrics":{"A":"Low","AC":"High","PR":"None","S":"Unchanged","C":"Low","UI":"None","AV":"Network","I":"Low"},"extraData":{}}</REMEDIATE> -->
|
non_process
|
cve medium detected in minimist tgz minimist tgz cve medium severity vulnerability vulnerable libraries minimist tgz minimist tgz minimist tgz parse argument options library home page a href path to dependency file box box ui elements package json path to vulnerable library box box ui elements node modules minimist package json dependency hierarchy stylelint tgz root library postcss sass tgz gonzales pe tgz x minimist tgz vulnerable library minimist tgz parse argument options library home page a href path to dependency file box box ui elements package json path to vulnerable library box box ui elements node modules minimist package json dependency hierarchy react tgz root library core tgz tgz x minimist tgz vulnerable library found in head commit a href found in base branch master vulnerability details minimist before could be tricked into adding or modifying properties of object prototype using a constructor or proto payload publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity high privileges required none user interaction none scope unchanged impact metrics confidentiality impact low integrity impact low availability impact low for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution minimist isopenpronvulnerability false ispackagebased true isdefaultbranch true packages istransitivedependency true dependencytree stylelint postcss sass gonzales pe minimist isminimumfixversionavailable true minimumfixversion minimist packagetype javascript node js packagename minimist packageversion packagefilepaths istransitivedependency true dependencytree storybook react storybook core minimist isminimumfixversionavailable true minimumfixversion minimist basebranches vulnerabilityidentifier cve vulnerabilitydetails minimist before could be tricked into adding or modifying properties of object prototype using a constructor or proto payload vulnerabilityurl
| 0
|
351,061
| 10,512,294,344
|
IssuesEvent
|
2019-09-27 17:32:15
|
digidem/mapeo-mobile
|
https://api.github.com/repos/digidem/mapeo-mobile
|
opened
|
Support skipping giving location permissions
|
help wanted priority
|
- Map should show with no location permissions, but "locate me" button should not show, and "GPS pill" at top of screen should be red as in https://www.notion.so/digidem/MM-Beta-UX-Refined-UIs-56504c2e739d426d81201e7fb9b116e9#23a302da4b4d498eb6d4c1c5ade3f09f
- Pressing the red GPS pill should show screen with "enable" button which *either* requests the user for location permissions, or, if location permission has been denied, re-directs the user to settings where they can turn on location for mapeo. Screen design: https://www.notion.so/digidem/MM-Beta-UX-Refined-UIs-56504c2e739d426d81201e7fb9b116e9#e54cc85eacec4ec282ce4e944e14c235
|
1.0
|
Support skipping giving location permissions - - Map should show with no location permissions, but "locate me" button should not show, and "GPS pill" at top of screen should be red as in https://www.notion.so/digidem/MM-Beta-UX-Refined-UIs-56504c2e739d426d81201e7fb9b116e9#23a302da4b4d498eb6d4c1c5ade3f09f
- Pressing the red GPS pill should show screen with "enable" button which *either* requests the user for location permissions, or, if location permission has been denied, re-directs the user to settings where they can turn on location for mapeo. Screen design: https://www.notion.so/digidem/MM-Beta-UX-Refined-UIs-56504c2e739d426d81201e7fb9b116e9#e54cc85eacec4ec282ce4e944e14c235
|
non_process
|
support skipping giving location permissions map should show with no location permissions but locate me button should not show and gps pill at top of screen should be red as in pressing the red gps pill should show screen with enable button which either requests the user for location permissions or if location permission has been denied re directs the user to settings where they can turn on location for mapeo screen design
| 0
|
173,667
| 27,510,173,678
|
IssuesEvent
|
2023-03-06 08:11:07
|
status-im/help.status.im
|
https://api.github.com/repos/status-im/help.status.im
|
closed
|
Adjust Status Help content width size
|
P:information-design P:platform
|
We need to check with the Design team what would be the recommended content width size in Status Help.
See [this discussion](https://github.com/squidfunk/mkdocs-material/discussions/2842) in the Material for MkDocs forum for information about how to control this parameter.
|
1.0
|
Adjust Status Help content width size - We need to check with the Design team what would be the recommended content width size in Status Help.
See [this discussion](https://github.com/squidfunk/mkdocs-material/discussions/2842) in the Material for MkDocs forum for information about how to control this parameter.
|
non_process
|
adjust status help content width size we need to check with the design team what would be the recommended content width size in status help see in the material for mkdocs forum for information about how to control this parameter
| 0
|
15,223
| 19,089,766,027
|
IssuesEvent
|
2021-11-29 10:45:00
|
RIOT-OS/RIOT
|
https://api.github.com/repos/RIOT-OS/RIOT
|
closed
|
periph/i2c: return values of i2c_aquire and i2c_release
|
Type: enhancement Process: API change Area: drivers State: don't stale
|
#### Description
I have read very carefully all related issues #6577 and PRs #6575, #6576 to figure out the answer to the following question:
For what purpose do the `i2c_acquire` and `i2c_release` functions still have a return value? In fact they simply lock and unlock a mutex on all platforms. Since `i2c_acquire` blocks until the mutex can be locked, the function will not fail.
Some implementations check whether the device parameter is valid, but there is no reason to have a return parameter for it. The recommended way to check the device parameter is using `assert`. In all other cases, all implementations simply return 0.
#### Disadvantages of the Current API
Since the API claims that the return value might be -1 in case of error, the return value would have to be checked.
The reality looks completely different. The return value is checked in less than 10 % (19 of 227) of `i2c_acquire` function calls and in 0% (0 of 380) `i2c_release` function calls.
#### Proposal
It is quite tedious to use something like the following over and over again just because the API defines return values, although the function can not cause an error.
```c
if (i2c_acquire(I2C_DEV(0)) != 0) {
...
}
```
The return values should be removed. Otherwise, all drivers would need a rework for the case that the function can fail.
|
1.0
|
periph/i2c: return values of i2c_aquire and i2c_release - #### Description
I have read very carefully all related issues #6577 and PRs #6575, #6576 to figure out the answer to the following question:
For what purpose do the `i2c_acquire` and `i2c_release` functions still have a return value? In fact they simply lock and unlock a mutex on all platforms. Since `i2c_acquire` blocks until the mutex can be locked, the function will not fail.
Some implementations check whether the device parameter is valid, but there is no reason to have a return parameter for it. The recommended way to check the device parameter is using `assert`. In all other cases, all implementations simply return 0.
#### Disadvantages of the Current API
Since the API claims that the return value might be -1 in case of error, the return value would have to be checked.
The reality looks completely different. The return value is checked in less than 10 % (19 of 227) of `i2c_acquire` function calls and in 0% (0 of 380) `i2c_release` function calls.
#### Proposal
It is quite tedious to use something like the following over and over again just because the API defines return values, although the function can not cause an error.
```c
if (i2c_acquire(I2C_DEV(0)) != 0) {
...
}
```
The return values should be removed. Otherwise, all drivers would need a rework for the case that the function can fail.
|
process
|
periph return values of aquire and release description i have read very carefully all related issues and prs to figure out the answer to the following question for what purpose do the acquire and release functions still have a return value in fact they simply lock and unlock a mutex on all platforms since acquire blocks until the mutex can be locked the function will not fail some implementations check whether the device parameter is valid but there is no reason to have a return parameter for it the recommended way to check the device parameter is using assert in all other cases all implementations simply return disadvantages of the current api since the api claims that the return value might be in case of error the return value would have to be checked the reality looks completely different the return value is checked in less than of of acquire function calls and in of release function calls proposal it is quite tedious to use something like the following over and over again just because the api defines return values although the function can not cause an error c if acquire dev the return values should be removed otherwise all drivers would need a rework for the case that the function can fail
| 1
|
15,556
| 19,703,503,309
|
IssuesEvent
|
2022-01-12 19:07:59
|
googleapis/java-policy-troubleshooter
|
https://api.github.com/repos/googleapis/java-policy-troubleshooter
|
opened
|
Your .repo-metadata.json file has a problem 🤒
|
type: process repo-metadata: lint
|
You have a problem with your .repo-metadata.json file:
Result of scan 📈:
* release_level must be equal to one of the allowed values in .repo-metadata.json
* api_shortname 'policy-troubleshooter' invalid in .repo-metadata.json
☝️ Once you correct these problems, you can close this issue.
Reach out to **go/github-automation** if you have any questions.
|
1.0
|
Your .repo-metadata.json file has a problem 🤒 - You have a problem with your .repo-metadata.json file:
Result of scan 📈:
* release_level must be equal to one of the allowed values in .repo-metadata.json
* api_shortname 'policy-troubleshooter' invalid in .repo-metadata.json
☝️ Once you correct these problems, you can close this issue.
Reach out to **go/github-automation** if you have any questions.
|
process
|
your repo metadata json file has a problem 🤒 you have a problem with your repo metadata json file result of scan 📈 release level must be equal to one of the allowed values in repo metadata json api shortname policy troubleshooter invalid in repo metadata json ☝️ once you correct these problems you can close this issue reach out to go github automation if you have any questions
| 1
|
44,367
| 2,904,080,605
|
IssuesEvent
|
2015-06-18 16:21:47
|
ngds/ckanext-ngds
|
https://api.github.com/repos/ngds/ckanext-ngds
|
closed
|
Deploy service from Tier3 CSV file with link from harvested metadata
|
in progress priority.Blocking
|
see #610 for background
When a harvested record indicates that a Tier3 CSV data set is available for a resource, get the csv, validate it, and if valid, deploy an NGDS web service. The technical requirements are described in a document that is in the ngds/documents repository: https://github.com/ngds/documents/blob/master/GDR_integrationProjectRequirements.docx
The requirements for identifying the correct distribution don't account for multiple distribution options; and should have identified that a usgin: content model keyword would be present. We need to review what the application is looking for in the metadata record to identify the correct distribution link to get the csv file. @dano-reisys can you get that info?
|
1.0
|
Deploy service from Tier3 CSV file with link from harvested metadata - see #610 for background
When a harvested record indicates that a Tier3 CSV data set is available for a resource, get the csv, validate it, and if valid, deploy an NGDS web service. The technical requirements are described in a document that is in the ngds/documents repository: https://github.com/ngds/documents/blob/master/GDR_integrationProjectRequirements.docx
The requirements for identifying the correct distribution don't account for multiple distribution options; and should have identified that a usgin: content model keyword would be present. We need to review what the application is looking for in the metadata record to identify the correct distribution link to get the csv file. @dano-reisys can you get that info?
|
non_process
|
deploy service from csv file with link from harvested metadata see for background when a harvested record indicates that a csv data set is available for a resource get the csv validate it and if valid deploy an ngds web service the technical requirements are described in a document that is in the ngds documents repository the requirements for identifying the correct distribution don t account for multiple distribution options and should have identified that a usgin content model keyword would be present we need to review what the application is looking for in the metadata record to identify the correct distribution link to get the csv file dano reisys can you get that info
| 0
|
57,932
| 11,810,576,128
|
IssuesEvent
|
2020-03-19 16:42:35
|
dotnet/runtime
|
https://api.github.com/repos/dotnet/runtime
|
opened
|
Do not hold SpinLock in fields marked as readonly
|
api-suggestion area-System.Threading code-analyzer
|
`SpinLock` is a mutable struct, meant only for advanced scenarios. Accidentally making a `SpinLock` field `readonly` can result in silent but significant problems, as any mutations to the instance (e.g. Enter, Exit) will be done on a compiler-generated copy and thus be ignored, making the lock an expensive nop. (It might make sense to extend this analyzer to additional mutable struct types where storing them in a `readonly` field is likely a bug, e.g. `GCHandle`.)
**Category**: Reliability
|
1.0
|
Do not hold SpinLock in fields marked as readonly - `SpinLock` is a mutable struct, meant only for advanced scenarios. Accidentally making a `SpinLock` field `readonly` can result in silent but significant problems, as any mutations to the instance (e.g. Enter, Exit) will be done on a compiler-generated copy and thus be ignored, making the lock an expensive nop. (It might make sense to extend this analyzer to additional mutable struct types where storing them in a `readonly` field is likely a bug, e.g. `GCHandle`.)
**Category**: Reliability
|
non_process
|
do not hold spinlock in fields marked as readonly spinlock is a mutable struct meant only for advanced scenarios accidentally making a spinlock field readonly can result in silent but significant problems as any mutations to the instance e g enter exit will be done on a compiler generated copy and thus be ignored making the lock an expensive nop it might make sense to extend this analyzer to additional mutable struct types where storing them in a readonly field is likely a bug e g gchandle category reliability
| 0
|
14,783
| 18,057,204,227
|
IssuesEvent
|
2021-09-20 09:45:13
|
bazelbuild/bazel
|
https://api.github.com/repos/bazelbuild/bazel
|
opened
|
Tracking issue for Bazel 5.0 release
|
P1 type: process release
|
- [ ] Cut baseline commit (~mid October 2021)
- [ ] Further steps TBA
|
1.0
|
Tracking issue for Bazel 5.0 release - - [ ] Cut baseline commit (~mid October 2021)
- [ ] Further steps TBA
|
process
|
tracking issue for bazel release cut baseline commit mid october further steps tba
| 1
|
101,082
| 30,863,061,675
|
IssuesEvent
|
2023-08-03 05:44:56
|
vuejs/vitepress
|
https://api.github.com/repos/vuejs/vitepress
|
closed
|
outDir logic is too confusing now
|
bug build
|
### Describe the bug
I'm trying to build a site in a custom folder and noticed several issues.
My site is located in the folder `sites/mysite.com`.
When I run following command in the root of my project:
```
npx vitepress build sites/mysite.com --outDir public
```
Instead of writing to ${workplaceFolder}/public it actually still resolves outDir relative to sites/mySite.com, so to make it work I currently need to use `../../public` or `$(pwd)/public`, both of which are confusing because from the CLI call it looks like I am writing to something above.
My suggestion is that a relative path should be resolved relative to cwd, not the docs folder.
But even so, what I find even more strange is that this setting only impacts assets, while the actual html pages are still located in the .vitepress/dist folder. Do you know how to fix that too? Thanks!
### Reproduction
Just create a nested project like sites/test.site and try to build it to a public/test.site folder in your root.
### Expected behavior
- command like `vitepress build path/to/my/site --outDir public` resolves to a public folder in your root - not in the package.
- html pages should be also built respectively to outDir parameter
### System Info
```sh
System:
OS: Linux 5.15 Debian GNU/Linux 11 (bullseye) 11 (bullseye)
CPU: (12) x64 12th Gen Intel(R) Core(TM) i7-1265U
Memory: 11.75 GB / 15.34 GB
Container: Yes
Shell: 5.1.4 - /bin/bash
Binaries:
Node: 20.3.1 - /usr/local/bin/node
Yarn: 1.22.19 - /usr/local/bin/yarn
npm: 9.6.7 - /usr/local/bin/npm
pnpm: 8.6.6 - /usr/local/share/npm-global/bin/pnpm
npmPackages:
vitepress: ^1.0.0-beta.6 => 1.0.0-beta.6
```
### Additional context
_No response_
### Validations
- [X] Check if you're on the [latest VitePress version](https://github.com/vuejs/vitepress/releases/latest).
- [X] Follow our [Code of Conduct](https://vuejs.org/about/coc.html)
- [X] Read the [docs](https://vitepress.dev).
- [X] Check that there isn't already an issue that reports the same bug to avoid creating a duplicate.
|
1.0
|
outDir logic is too confusing now - ### Describe the bug
I'm trying to build a site in a custom folder and noticed several issues.
My site is located in the folder `sites/mysite.com`.
When I run following command in the root of my project:
```
npx vitepress build sites/mysite.com --outDir public
```
Instead of writing to ${workplaceFolder}/public it actually still resolves outDir relative to sites/mySite.com, so to make it work I currently need to use `../../public` or `$(pwd)/public`, both of which are confusing because from the CLI call it looks like I am writing to something above.
My suggestion is that a relative path should be resolved relative to cwd, not the docs folder.
But even so, what I find even more strange is that this setting only impacts assets, while the actual html pages are still located in the .vitepress/dist folder. Do you know how to fix that too? Thanks!
### Reproduction
Just create a nested project like sites/test.site and try to build it to a public/test.site folder in your root.
### Expected behavior
- command like `vitepress build path/to/my/site --outDir public` resolves to a public folder in your root - not in the package.
- html pages should be also built respectively to outDir parameter
### System Info
```sh
System:
OS: Linux 5.15 Debian GNU/Linux 11 (bullseye) 11 (bullseye)
CPU: (12) x64 12th Gen Intel(R) Core(TM) i7-1265U
Memory: 11.75 GB / 15.34 GB
Container: Yes
Shell: 5.1.4 - /bin/bash
Binaries:
Node: 20.3.1 - /usr/local/bin/node
Yarn: 1.22.19 - /usr/local/bin/yarn
npm: 9.6.7 - /usr/local/bin/npm
pnpm: 8.6.6 - /usr/local/share/npm-global/bin/pnpm
npmPackages:
vitepress: ^1.0.0-beta.6 => 1.0.0-beta.6
```
### Additional context
_No response_
### Validations
- [X] Check if you're on the [latest VitePress version](https://github.com/vuejs/vitepress/releases/latest).
- [X] Follow our [Code of Conduct](https://vuejs.org/about/coc.html)
- [X] Read the [docs](https://vitepress.dev).
- [X] Check that there isn't already an issue that reports the same bug to avoid creating a duplicate.
|
non_process
|
outdir logic is too confusing now describe the bug i m trying to build a site in a custom folder and noticed several issues my site is located in the folder sites mysite com when i run following command in the root of my project npx vitepress build sites mysite com outdir public instead of writing to workplacefolder public it actually still resolves outdir relatively to sites mysite com so to make it working i need to use currently public or pwd public which are both too confusing because from cli call it looks like i write to something above my suggestion is that relative path needs to be resolved relatively to cwd not a docs folder but even like that what i find even more strange this setting only impacts assets while actual html pages are still located in the vitepress dist folder do you know how to fix that too thanks reproduction just create a nested project like sites test site and try to build it to a public test site folder in your root expected behavior command like vitepress build path to my site outdir public resolves to a public folder in your root not in the package html pages should be also built respectively to outdir parameter system info sh system os linux debian gnu linux bullseye bullseye cpu gen intel r core tm memory gb gb container yes shell bin bash binaries node usr local bin node yarn usr local bin yarn npm usr local bin npm pnpm usr local share npm global bin pnpm npmpackages vitepress beta beta additional context no response validations check if you re on the follow our read the check that there isn t already an issue that reports the same bug to avoid creating a duplicate
| 0
|
6,619
| 6,546,805,172
|
IssuesEvent
|
2017-09-04 12:03:21
|
FlowFX/unkenmathe.de
|
https://api.github.com/repos/FlowFX/unkenmathe.de
|
closed
|
Check whether sessions are used or not
|
improvement security
|
They apparently aren't used at the moment:
cf. https://www.ponycheckup.com/result/?url=https%3A%2F%2Fwww.unkenmathe.de%2F
|
True
|
Check whether sessions are used or not - They apparently aren't used at the moment:
cf. https://www.ponycheckup.com/result/?url=https%3A%2F%2Fwww.unkenmathe.de%2F
|
non_process
|
check whether sessions are used or not they apparently aren t used at the moment cf
| 0
|
9,505
| 3,050,965,584
|
IssuesEvent
|
2015-08-12 03:52:53
|
hylang/hy
|
https://api.github.com/repos/hylang/hy
|
closed
|
defn has no tests
|
backend good-first-bug missing-tests
|
I can't seem to find any tests of defn. I have found some defn behaviour that is at least weird:
I should not be able to use anything but a HySymbol as name for a function. I can currently use:
* a string (defn "hy" [] 1) (this at least can later be called without quotes, which might be seen as inconsistent)
* a lambda list. (defn &hy [] 1) This can't later be called.
* any other new HyType I define that is a string (see my keywords pr #295, it can be used too) e.g.: (defn %hy [] 1)
* keywords. (defn :hy [] 1) Although this one fails due to other reasons.
Inconsistent behaviour:
* (defn if [] 1) should either fail or override if macro? currently it doesn't fail but behavior is the one from "if" macro.
I was planning to add some tests, but I'm not sure where they should be added and if any of those is a non-issue
|
1.0
|
defn has no tests - I can't seem to find any tests of defn. I have found some defn behaviour that is at least weird:
I should not be able to use anything but a HySymbol as name for a function. I can currently use:
* a string (defn "hy" [] 1) (this at least can later be called without quotes, which might be seen as inconsistent)
* a lambda list. (defn &hy [] 1) This can't later be called.
* any other new HyType I define that is a string (see my keywords pr #295, it can be used too) e.g.: (defn %hy [] 1)
* keywords. (defn :hy [] 1) Although this one fails due to other reasons.
Inconsistent behaviour:
* (defn if [] 1) should either fail or override if macro? currently it doesn't fail but behavior is the one from "if" macro.
I was planning to add some tests, but I'm not sure where they should be added and if any of those is a non-issue
|
non_process
|
defn has no tests i can t seem to find any tests of defn i have found some defn behaviour that is at least weird i should not be able to use anything but a hysymbol as name for a function i can currently use a string defn hy this at least can later be called without quotes which might be seen as inconsistent a lambda list defn hy this can t later be called any other new hytype i define that is a string see my keywords pr it can be used too e g defn hy keywords defn hy although this one fails due to other reasons inconsistent behaviour defn if should either fail or override if macro currently it doesn t fail but behavior is the one from if macro i was planning to add some tests but i m not sure where they should be added and if any of those is a non issue
| 0
|
179,827
| 13,905,826,738
|
IssuesEvent
|
2020-10-20 10:23:59
|
YaccConstructor/Brahma.FSharp
|
https://api.github.com/repos/YaccConstructor/Brahma.FSharp
|
closed
|
Test 1. Specializer for OpenCL C
|
CSC Test
|
For those who plan to work on creating examples for the experimental evaluation.
Implement parallel matrix transposition on a GPGPU using Brahma.FSharp. Use the package from NuGet.
Result: a link to a repository with the solution in the comments to this issue.
|
1.0
|
Test 1. Specializer for OpenCL C - For those who plan to work on creating examples for the experimental evaluation.
Implement parallel matrix transposition on a GPGPU using Brahma.FSharp. Use the package from NuGet.
Result: a link to a repository with the solution in the comments to this issue.
|
non_process
|
test specializer for opencl c for those who plan to work on creating examples for the experimental evaluation implement parallel matrix transposition on a gpgpu using brahma fsharp use the package from nuget result a link to a repository with the solution in the comments to this issue
| 0
|
10,913
| 13,690,338,743
|
IssuesEvent
|
2020-09-30 14:16:09
|
elastic/beats
|
https://api.github.com/repos/elastic/beats
|
opened
|
[libbeat] add_process_metadata target namespace should be fully configurable
|
:Processors libbeat
|
`add_process_metadata` always includes `process.` in field names (e.g. `process.name`). Configuring the target namespace prefixes the field name with the configured value, but you cannot change the `process.*` part. This is a problem if you want to name the fields like `process.parent.name`.
- https://www.elastic.co/guide/en/beats/functionbeat/7.9/add-process-metadata.html
- https://github.com/elastic/beats/blob/5e69e25b920e3d93bec76a09a31da3ab35a55607/libbeat/processors/add_process_metadata/add_process_metadata.go#L253-L266
|
1.0
|
[libbeat] add_process_metadata target namespace should be fully configurable - `add_process_metadata` always includes `process.` in field names (e.g. `process.name`). Configuring the target namespace prefixes the field name with the configured value, but you cannot change the `process.*` part. This is a problem if you want to name the fields like `process.parent.name`.
- https://www.elastic.co/guide/en/beats/functionbeat/7.9/add-process-metadata.html
- https://github.com/elastic/beats/blob/5e69e25b920e3d93bec76a09a31da3ab35a55607/libbeat/processors/add_process_metadata/add_process_metadata.go#L253-L266
|
process
|
add process metadata target namespace should be fully configurable add process metadata always includes process in field names e g process name configuring the target namespace prefixes the field name with the configured value but you cannot change the process part this is a problem if you want to name the fields like process parent name
| 1
|
3,300
| 6,395,655,437
|
IssuesEvent
|
2017-08-04 13:47:48
|
itsyouonline/identityserver
|
https://api.github.com/repos/itsyouonline/identityserver
|
closed
|
inviting people based on email did not work for me
|
process_wontfix type_bug
|
- ask also Nickolay
- they get the email, then they get a page, but nowhere can the invitation be accepted
- how does this work if multiple emails or tel nr's are used? Will it properly identify the right account if it already exists? Can a user overrule that he already exists under another email?
|
1.0
|
inviting people based on email did not work for me -
- ask also Nickolay
- they get the email, then they get a page, but nowhere can the invitation be accepted
- how does this work if multiple emails or tel nr's are used? Will it properly identify the right account if it already exists? Can a user overrule that he already exists under another email?
|
process
|
inviting people based on email did not work for me ask also nickolay they get the email then they get a page but nowhere can the invitation be accepted how does this work if multiple emails or tel nr s are used will it properly identify the right account if it already exists can a user overrule that he already exists under another email
| 1
|
714
| 3,052,182,218
|
IssuesEvent
|
2015-08-12 13:30:06
|
mesosphere/marathon
|
https://api.github.com/repos/mesosphere/marathon
|
closed
|
Kill & Scale, followed by scale-up does not work.
|
bug service
|
On Marathon 0.8.0, when we have multiple instances of an app deployed, and we select one of the instances and use the "kill & Scale" button on the UI, that particular instance of the app is destroyed. But when we try to scale up the instance by one, marathon does not react to it; although I see the request in marathon logs, there is no activity on mesos master and the mesos slave in context.
However, now if I scale down the number of apps using the scale button, which chooses one app and kills it, and then scale up, I am able to recover my original number of instances.
Tried this with two different deployments of Mesos/Marathon clusters and I see the same behavior.
|
1.0
|
Kill & Scale, followed by scale-up does not work. - On Marathon 0.8.0, when we have multiple instances of an app deployed, and we select one of the instances and use the "kill & Scale" button on the UI, that particular instance of the app is destroyed. But when we try to scale up the instance by one, marathon does not react to it; although I see the request in marathon logs, there is no activity on mesos master and the mesos slave in context.
However, now if I scale down the number of apps using the scale button, which chooses one app and kills it, and then scale up, I am able to recover my original number of instances.
Tried this with two different deployments of Mesos/Marathon clusters and I see the same behavior.
|
non_process
|
kill scale followed by scale up does not work on marathon when we have multiple instances of an app deployed and we select one of the instances and use the kill scale button on the ui that particular instance of the app is destroyed but when we try to scale up the instance by one marathon does not react to it although i see the request in marathon logs there is no activity on mesos master and the mesos slave in context however now if i scale down the number of apps using the scale button which chooses one app and kills it and then scale up i am able to recover my original number of instances tried this with two different deployments of mesos marathon clusters and i see the same behavior
| 0
|
22,219
| 30,768,985,553
|
IssuesEvent
|
2023-07-30 17:07:18
|
km4ack/73Linux
|
https://api.github.com/repos/km4ack/73Linux
|
closed
|
vk4gra Graham
|
in process
|
I am installing 73linux on a fresh Linux Mint I just downloaded. Please do not get upset, and I know you haven't released it. A few things that I have detected is that "GIT" did not come standard in the Linux Mint. It's no problem.
Graham vk4gra
|
1.0
|
vk4gra Graham - I am installing 73linux on a fresh Linux Mint I just downloaded. Please do not get upset, and I know you haven't released it. A few things that I have detected is that "GIT" did not come standard in the Linux Mint. It's no problem.
Graham vk4gra
|
process
|
graham i am installing on a fresh linux mint i just downloaded please do not get upset and i know you haven t released it a few things that i have detected is that git did not come standard in the linux mint it s no problem graham
| 1
|
7,976
| 11,167,475,333
|
IssuesEvent
|
2019-12-27 17:18:00
|
geneontology/go-ontology
|
https://api.github.com/repos/geneontology/go-ontology
|
closed
|
NTR and rename term for effector mediated immune responses
|
New term request multi-species process
|
replaces ticket
https://github.com/geneontology/go-ontology/issues/18001
- [ ] GO:0034053 modulation by symbiont of host defense-related programmed cell death
Definition: Any process in which a symbiont modulates the frequency, rate or extent of defense-related programmed cell death in the host organism. The host is defined as the larger of the organisms involved in a symbiotic interaction.
5 EXP annotations
I checked these 5 papers and these ALL refer to
"Effector-mediated mitigation of host-defence related programmed cell death"
Can we change this terms name to "Effector-mediated mitigation of host-defence related programmed cell death", this will really help annotators and the "effector-mediated" alteration of host is a really important concept in the pathogen community.
- [ ] GO:0034055 positive regulation by symbiont of host defense-related programmed cell death
-> effector-mediated activation of host-defence related programmed cell death
- [ ] GO:0034054 negative regulation by symbiont of host defense-related programmed cell death
-> effector-mediated suppression of host-defence related programmed cell death
|
1.0
|
NTR and rename term for effector mediated immune responses - replaces ticket
https://github.com/geneontology/go-ontology/issues/18001
- [ ] GO:0034053 modulation by symbiont of host defense-related programmed cell death
Definition: Any process in which a symbiont modulates the frequency, rate or extent of defense-related programmed cell death in the host organism. The host is defined as the larger of the organisms involved in a symbiotic interaction.
5 EXP annotations
I checked these 5 papers and these ALL refer to
"Effector-mediated mitigation of host-defence related programmed cell death"
Can we change this term's name to "Effector-mediated mitigation of host-defence related programmed cell death"? This will really help annotators, and the "effector-mediated" alteration of host is a really important concept in the pathogen community.
- [ ] GO:0034055 positive regulation by symbiont of host defense-related programmed cell death
-> effector-mediated activation of host-defence related programmed cell death
- [ ] GO:0034054 negative regulation by symbiont of host defense-related programmed cell death
-> effector-mediated suppression of host-defence related programmed cell death
|
process
|
ntr and rename term for effector mediated immune responses replaces ticket go modulation by symbiont of host defense related programmed cell death definition any process in which a symbiont modulates the frequency rate or extent of defense related programmed cell death in the host organism the host is defined as the larger of the organisms involved in a symbiotic interaction exp annotations i checked these papers and these all refer to effector mediated mitigation of host defence related programmed cell death can we change this term s name to effector mediated mitigation of host defence related programmed cell death this will really help annotators and the effector mediated alteration of host is a really important concept in the pathogen community go positive regulation by symbiont of host defense related programmed cell death effector mediated activation of host defence related programmed cell death go negative regulation by symbiont of host defense related programmed cell death effector mediated suppression of host defence related programmed cell death
| 1
|
5,125
| 26,125,073,439
|
IssuesEvent
|
2022-12-28 17:20:03
|
hamcrest/JavaHamcrest
|
https://api.github.com/repos/hamcrest/JavaHamcrest
|
closed
|
Gather sources under a single root
|
maintainability
|
The source tree is organised into multiple source roots. This makes the build more complicated. Since 7.0 will be released as a single JAR (see #86), there is no need for different source roots.
|
True
|
Gather sources under a single root - The source tree is organised into multiple source roots. This makes the build more complicated. Since 7.0 will be released as a single JAR (see #86), there is no need for different source roots.
|
non_process
|
gather sources under a single root the source tree is organised into multiple source roots this makes the build more complicated since will be released as a single jar see there is no need for different source roots
| 0
|
3,413
| 6,523,923,020
|
IssuesEvent
|
2017-08-29 10:34:07
|
w3c/w3process
|
https://api.github.com/repos/w3c/w3process
|
closed
|
Duplicate sentences in Chapter 6
|
Editorial improvements Process2018Candidate question
|
> Please note that *publishing* as used in this document refers to producing a version which is listed as a W3C Technical Report on its Technical Reports page https://www.w3.org/TR.
This sentence is in both [6.1](https://w3c.github.io/w3process/#rec-advance) and [6.2](https://w3c.github.io/w3process/#requirements-and-definitions). Is it intentional?
|
1.0
|
Duplicate sentences in Chapter 6 - > Please note that *publishing* as used in this document refers to producing a version which is listed as a W3C Technical Report on its Technical Reports page https://www.w3.org/TR.
This sentence is in both [6.1](https://w3c.github.io/w3process/#rec-advance) and [6.2](https://w3c.github.io/w3process/#requirements-and-definitions). Is it intentional?
|
process
|
duplicate sentences in chapter please note that publishing as used in this document refers to producing a version which is listed as a technical report on its technical reports page this sentence is in both and is it intentional
| 1
|
50,673
| 13,187,675,063
|
IssuesEvent
|
2020-08-13 04:11:43
|
icecube-trac/tix3
|
https://api.github.com/repos/icecube-trac/tix3
|
closed
|
Truncated Energy No Examples Provided (Trac #1169)
|
Migrated from Trac combo reconstruction defect
|
Please provide at least one simple example of how truncated_energy module is used to calculate a particle's energy.
Make sure to comment the code extensively.
<details>
<summary><em>Migrated from <a href="https://code.icecube.wisc.edu/ticket/1169">https://code.icecube.wisc.edu/ticket/1169</a>, reported by jtatar and owned by jtatar</em></summary>
<p>
```json
{
"status": "closed",
"changetime": "2019-02-13T14:11:57",
"description": "Please provide at least one simple example of how truncated_energy module is used to calculate a particle's energy.\n\nMake sure to comment the code extensively.",
"reporter": "jtatar",
"cc": "",
"resolution": "fixed",
"_ts": "1550067117911749",
"component": "combo reconstruction",
"summary": "Truncated Energy No Examples Provided",
"priority": "blocker",
"keywords": "",
"time": "2015-08-18T21:02:53",
"milestone": "",
"owner": "jtatar",
"type": "defect"
}
```
</p>
</details>
|
1.0
|
Truncated Energy No Examples Provided (Trac #1169) - Please provide at least one simple example of how truncated_energy module is used to calculate a particle's energy.
Make sure to comment the code extensively.
<details>
<summary><em>Migrated from <a href="https://code.icecube.wisc.edu/ticket/1169">https://code.icecube.wisc.edu/ticket/1169</a>, reported by jtatar and owned by jtatar</em></summary>
<p>
```json
{
"status": "closed",
"changetime": "2019-02-13T14:11:57",
"description": "Please provide at least one simple example of how truncated_energy module is used to calculate a particle's energy.\n\nMake sure to comment the code extensively.",
"reporter": "jtatar",
"cc": "",
"resolution": "fixed",
"_ts": "1550067117911749",
"component": "combo reconstruction",
"summary": "Truncated Energy No Examples Provided",
"priority": "blocker",
"keywords": "",
"time": "2015-08-18T21:02:53",
"milestone": "",
"owner": "jtatar",
"type": "defect"
}
```
</p>
</details>
|
non_process
|
truncated energy no examples provided trac please provide at least one simple example of how truncated energy module is used to calculate a particle s energy make sure to comment the code extensively migrated from json status closed changetime description please provide at least one simple example of how truncated energy module is used to calculate a particle s energy n nmake sure to comment the code extensively reporter jtatar cc resolution fixed ts component combo reconstruction summary truncated energy no examples provided priority blocker keywords time milestone owner jtatar type defect
| 0
|
126,894
| 26,937,396,564
|
IssuesEvent
|
2023-02-07 21:58:00
|
phetsims/calculus-grapher
|
https://api.github.com/repos/phetsims/calculus-grapher
|
closed
|
Document options and fields
|
dev:code-review status:ready-for-review type:documentation
|
Make a pass through all files, document all options and fields whose semantics are not obvious.
|
1.0
|
Document options and fields - Make a pass through all files, document all options and fields whose semantics are not obvious.
|
non_process
|
document options and fields make a pass through all files document all options and fields whose semantics are not obvious
| 0
|
18,858
| 24,776,374,466
|
IssuesEvent
|
2022-10-23 19:35:27
|
OpenDataScotland/the_od_bods
|
https://api.github.com/repos/OpenDataScotland/the_od_bods
|
closed
|
National Library Scotland Multiple Data Downloads
|
data processing back end
|
We use a [web scraper on National Library Scotland](https://github.com/OpenDataScotland/the_od_bods/blob/main/web-scrapers/nls_scraper.py)'s pages.
The current scraper identifies the dataset as the first instance of a "Download" button if the button text matches "Download full dataset", "Download the dataset", "Download the data" or "Download sample dataset".
But some NLS pages may have multiple download buttons for multiple datasets, as in [this example](https://data.nls.uk/data/digitised-collections/british-army-lists/) where the dataset has been split into 4 parts. Currently, only the first asset/button will be returned ("Download sample dataset"), but ideally we should retrieve all assets/buttons, which is a significant change to the scraper approach.
Potentially: find all buttons, maybe retain all which say "download" or "data" (?), then treat each download as a separate asset. In this way a page/dataset title can have more than 1 asset, as is consistent with the rest of the ODS listing.
|
1.0
|
National Library Scotland Multiple Data Downloads - We use a [web scraper on National Library Scotland](https://github.com/OpenDataScotland/the_od_bods/blob/main/web-scrapers/nls_scraper.py)'s pages.
The current scraper identifies the dataset as the first instance of a "Download" button if the button text matches "Download full dataset", "Download the dataset", "Download the data" or "Download sample dataset".
But some NLS pages may have multiple download buttons for multiple datasets, as in [this example](https://data.nls.uk/data/digitised-collections/british-army-lists/) where the dataset has been split into 4 parts. Currently, only the first asset/button will be returned ("Download sample dataset"), but ideally we should retrieve all assets/buttons, which is a significant change to the scraper approach.
Potentially: find all buttons, maybe retain all which say "download" or "data" (?), then treat each download as a separate asset. In this way a page/dataset title can have more than 1 asset, as is consistent with the rest of the ODS listing.
|
process
|
national library scotland multiple data downloads we use a pages the current scraper identifies the dataset as the first instance of a download button if the button text matches download full dataset download the dataset download the data or download sample dataset but some nls pages may have multiple download buttons for multiple datasets as in where the dataset has been split into parts currently only the first asset button will be returned download sample dataset but ideally we should retrieve all assets buttons which is a significant change to the scraper approach potentially find all buttons maybe retain all which say download or data then treat each download as a separate asset in this way a page dataset title can have more than asset as is consistent with the rest of the ods listing
| 1
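The multi-button approach proposed in the OpenDataScotland issue above (find all buttons, retain those whose text mentions "download" or "data", treat each as its own asset) can be sketched with Python's standard-library `html.parser`. The page markup and button texts below are hypothetical illustrations, not actual NLS HTML:

```python
from html.parser import HTMLParser

class DownloadButtonParser(HTMLParser):
    """Collect (text, href) for every anchor whose text mentions 'download' or 'data'."""

    def __init__(self):
        super().__init__()
        self._href = None   # href of the <a> currently open, if any
        self.assets = []    # one entry per matching button, so a page can yield many assets

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self._href = dict(attrs).get("href")

    def handle_data(self, data):
        text = data.strip()
        if self._href and text and ("download" in text.lower() or "data" in text.lower()):
            self.assets.append((text, self._href))

    def handle_endtag(self, tag):
        if tag == "a":
            self._href = None

# Hypothetical page where one dataset is split into parts, as in the issue's example.
page = """
<a href="/part1.zip">Download part 1</a>
<a href="/part2.zip">Download part 2</a>
<a href="/about">About us</a>
"""
parser = DownloadButtonParser()
parser.feed(page)
print(parser.assets)  # every matching download button becomes its own asset
```

With this shape, each page/dataset title maps to a list of assets rather than only the first matching button.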
|
140,657
| 32,043,723,986
|
IssuesEvent
|
2023-09-22 22:03:37
|
aws/aws-sdk
|
https://api.github.com/repos/aws/aws-sdk
|
closed
|
AWS CodeArtifact package lifecycle: add publishedTime attribute to PackageVersionSummary object
|
feature-request investigating codeartifact
|
### Describe the feature
When we call the [ListPackageVersions](https://docs.aws.amazon.com/codeartifact/latest/APIReference/API_ListPackageVersions.html) operation, it returns a list of [PackageVersionSummary](https://docs.aws.amazon.com/codeartifact/latest/APIReference/API_PackageVersionSummary.html) objects. This object is pretty much like the [PackageVersionDescription](https://docs.aws.amazon.com/codeartifact/latest/APIReference/API_PackageVersionDescription.html), however with less attributes. I'd like to include the PackageVersionDescription `.packageVersion.publishedTime` to the PackageVersionSummary `.versions`.
### Use Case
When writing scripts to clean/purge packages (managing a custom lifecycle for artifacts/packages), let's say we want to have only the 3 latest package versions. In order to delete older versions, we have to loop through packages to get package versions and then loop through package versions (ListPackageVersions) to get the package version details (PackageVersionDescription) and find the publishedTime in order to sort all packages. The problem is that this requires an API describe call for each and every version, and it takes a long time depending on how many versions you have for every single artifact package.
### Proposed Solution
I propose to include the key `.packageVersion.publishedTime` from the `PackageVersionDescription` to the object `PackageVersionSummary` (`.versions.publishedTime`).
Now, the `aws codeartifact list-package-versions` outputs as follows:
```json
{
"versions": [
{
"version": "1.0.0-STAGING.11",
"revision": "XXXXXXXXXXXXXXXXXXXX",
"status": "Published",
"origin": {
"domainEntryPoint": {
"repositoryName": "npm-store"
},
"originType": "INTERNAL"
}
},
{
"version": "1.1.0",
"revision": "XXXXXXXXXXXXXXXXXXXX",
"status": "Published",
"origin": {
"domainEntryPoint": {
"repositoryName": "npm-store"
},
"originType": "INTERNAL"
}
}
],
"defaultDisplayVersion": "1.1.0",
"format": "npm",
"package": "XXXXXXXXXXXXXXXXXXXX",
"namespace": "XXXXXXXXXXXXXXXXXXXX"
}
```
And I can sort the packages by published time using:
```bash
for $p in $(aws codeartifact list-packages $OPTS | jq -r '.packages[].package'); do
for $pv in $(aws codeartifact list-package-versions $PACKAGE_OPTS | jq -r '.versions[].version'); do
aws codeartifact describe-package-version $PACKAGE_VERSION_OPTS | jq '.packageVersion.publishedTime'
done
done
```
Propose:
```json
{
"versions": [
{
"version": "1.0.0-STAGING.11",
"revision": "XXXXXXXXXXXXXXXXXXXX",
"status": "Published",
"publishedTime": "2023-08-15T12:52:39.864000-03:00",
"origin": {
"domainEntryPoint": {
"repositoryName": "npm-store"
},
"originType": "INTERNAL"
}
},
{
"version": "1.1.0",
"revision": "XXXXXXXXXXXXXXXXXXXX",
"status": "Published",
"publishedTime": "2023-08-15T12:52:39.864000-03:00",
"origin": {
"domainEntryPoint": {
"repositoryName": "npm-store"
},
"originType": "INTERNAL"
}
}
],
"defaultDisplayVersion": "1.1.0",
"format": "npm",
"package": "XXXXXXXXXXXXXXXXXXXX",
"namespace": "XXXXXXXXXXXXXXXXXXXX"
}
```
With this output, we could sort just by (sparing a lot of API calls):
```bash
for $p in $(aws codeartifact list-packages $OPTS | jq -r '.packages[].package'); do
aws codeartifact list-package-versions $PACKAGE_OPTS | jq '.versions[].publishedTime'
done
```
### Other Information
_No response_
### Acknowledgements
- [ ] I may be able to implement this feature request
- [ ] This feature might incur a breaking change
### CLI version used
aws-cli/2.12.6 Python/3.11.4 Linux/5.15.90.1-microsoft-standard-WSL2 exe/x86_64.ubuntu.20 prompt/off
### Environment details (OS name and version, etc.)
Linux DM-0028 5.15.90.1-microsoft-standard-WSL2 #1 SMP Fri Jan 27 02:56:13 UTC 2023 x86_64 x86_64 x86_64 GNU/Linux
|
1.0
|
AWS CodeArtifact package lifecycle: add publishedTime attribute to PackageVersionSummary object - ### Describe the feature
When we call the [ListPackageVersions](https://docs.aws.amazon.com/codeartifact/latest/APIReference/API_ListPackageVersions.html) operation, it returns a list of [PackageVersionSummary](https://docs.aws.amazon.com/codeartifact/latest/APIReference/API_PackageVersionSummary.html) objects. This object is pretty much like the [PackageVersionDescription](https://docs.aws.amazon.com/codeartifact/latest/APIReference/API_PackageVersionDescription.html), however with less attributes. I'd like to include the PackageVersionDescription `.packageVersion.publishedTime` to the PackageVersionSummary `.versions`.
### Use Case
When writing scripts to clean/purge packages (managing a custom lifecycle for artifacts/packages), let's say we want to have only the 3 latest package versions. In order to delete older versions, we have to loop through packages to get package versions and then loop through package versions (ListPackageVersions) to get the package version details (PackageVersionDescription) and find the publishedTime in order to sort all packages. The problem is that this requires an API describe call for each and every version, and it takes a long time depending on how many versions you have for every single artifact package.
### Proposed Solution
I propose to include the key `.packageVersion.publishedTime` from the `PackageVersionDescription` to the object `PackageVersionSummary` (`.versions.publishedTime`).
Now, the `aws codeartifact list-package-versions` outputs as follows:
```json
{
"versions": [
{
"version": "1.0.0-STAGING.11",
"revision": "XXXXXXXXXXXXXXXXXXXX",
"status": "Published",
"origin": {
"domainEntryPoint": {
"repositoryName": "npm-store"
},
"originType": "INTERNAL"
}
},
{
"version": "1.1.0",
"revision": "XXXXXXXXXXXXXXXXXXXX",
"status": "Published",
"origin": {
"domainEntryPoint": {
"repositoryName": "npm-store"
},
"originType": "INTERNAL"
}
}
],
"defaultDisplayVersion": "1.1.0",
"format": "npm",
"package": "XXXXXXXXXXXXXXXXXXXX",
"namespace": "XXXXXXXXXXXXXXXXXXXX"
}
```
And I can sort the packages by published time using:
```bash
for $p in $(aws codeartifact list-packages $OPTS | jq -r '.packages[].package'); do
for $pv in $(aws codeartifact list-package-versions $PACKAGE_OPTS | jq -r '.versions[].version'); do
aws codeartifact describe-package-version $PACKAGE_VERSION_OPTS | jq '.packageVersion.publishedTime'
done
done
```
Propose:
```json
{
"versions": [
{
"version": "1.0.0-STAGING.11",
"revision": "XXXXXXXXXXXXXXXXXXXX",
"status": "Published",
"publishedTime": "2023-08-15T12:52:39.864000-03:00",
"origin": {
"domainEntryPoint": {
"repositoryName": "npm-store"
},
"originType": "INTERNAL"
}
},
{
"version": "1.1.0",
"revision": "XXXXXXXXXXXXXXXXXXXX",
"status": "Published",
"publishedTime": "2023-08-15T12:52:39.864000-03:00",
"origin": {
"domainEntryPoint": {
"repositoryName": "npm-store"
},
"originType": "INTERNAL"
}
}
],
"defaultDisplayVersion": "1.1.0",
"format": "npm",
"package": "XXXXXXXXXXXXXXXXXXXX",
"namespace": "XXXXXXXXXXXXXXXXXXXX"
}
```
With this output, we could sort just by (sparing a lot of API calls):
```bash
for $p in $(aws codeartifact list-packages $OPTS | jq -r '.packages[].package'); do
aws codeartifact list-package-versions $PACKAGE_OPTS | jq '.versions[].publishedTime'
done
```
### Other Information
_No response_
### Acknowledgements
- [ ] I may be able to implement this feature request
- [ ] This feature might incur a breaking change
### CLI version used
aws-cli/2.12.6 Python/3.11.4 Linux/5.15.90.1-microsoft-standard-WSL2 exe/x86_64.ubuntu.20 prompt/off
### Environment details (OS name and version, etc.)
Linux DM-0028 5.15.90.1-microsoft-standard-WSL2 #1 SMP Fri Jan 27 02:56:13 UTC 2023 x86_64 x86_64 x86_64 GNU/Linux
|
non_process
|
aws codeartifact package lifecycle add publishedtime attribute to packageversionsummary object describe the feature when we call the operation it returns a list of objects this object is pretty much like the however with less attributes i d like to include the packageversiondescription packageversion publishedtime to the packageversionsummary versions use case when writing scripts to clean purge packages managing a custom lifecycle for artifacts packages let s say we want to have only the latest package versions in order to delete older versions we have to loop through packages to get package versions and then loop through package versions listpackageversions to get the package version details packageversiondescription and find the publishedtime in order to sort all packages the problem is that this requires an api describe call for each and every version and it takes a long time depending on how many versions you have for every single artifact package proposed solution i propose to include the key packageversion publishedtime from the packageversiondescription to the object packageversionsummary versions publishedtime now the aws codeartifact list package versions outputs as follows json versions version staging revision xxxxxxxxxxxxxxxxxxxx status published origin domainentrypoint repositoryname npm store origintype internal version revision xxxxxxxxxxxxxxxxxxxx status published origin domainentrypoint repositoryname npm store origintype internal defaultdisplayversion format npm package xxxxxxxxxxxxxxxxxxxx namespace xxxxxxxxxxxxxxxxxxxx and i can sort the packages by published time using bash for p in aws codeartifact list packages opts jq r packages package do for pv in aws codeartifact list package versions package opts jq r versions version do aws codeartifact describe package version package version opts jq packageversion publishedtime done done propose json versions version staging revision xxxxxxxxxxxxxxxxxxxx status published publishedtime origin
domainentrypoint repositoryname npm store origintype internal version revision xxxxxxxxxxxxxxxxxxxx status published publishedtime origin domainentrypoint repositoryname npm store origintype internal defaultdisplayversion format npm package xxxxxxxxxxxxxxxxxxxx namespace xxxxxxxxxxxxxxxxxxxx with this output we could sort just by sparing a lot of api calls bash for p in aws codeartifact list packages opts jq r packages package do aws codeartifact list package versions package opts jq versions publishedtime done other information no response acknowledgements i may be able to implement this feature request this feature might incur a breaking change cli version used aws cli python linux microsoft standard exe ubuntu prompt off environment details os name and version etc linux dm microsoft standard smp fri jan utc gnu linux
| 0
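The lifecycle logic motivating the CodeArtifact feature request above (keep only the 3 newest versions, delete the rest) becomes a single local sort once `publishedTime` is present on each version summary. The helper and sample data below are a hypothetical offline sketch, not part of any AWS SDK:

```python
from datetime import datetime

def versions_to_delete(version_summaries, keep=3):
    """Return the versions beyond the `keep` newest, assuming each summary
    carries an ISO-8601 'publishedTime' field as the request proposes."""
    ordered = sorted(
        version_summaries,
        key=lambda v: datetime.fromisoformat(v["publishedTime"]),
        reverse=True,  # newest first
    )
    return [v["version"] for v in ordered[keep:]]

# Hypothetical summaries, shaped like the proposed ListPackageVersions output.
summaries = [
    {"version": "1.0.0", "publishedTime": "2023-08-01T10:00:00-03:00"},
    {"version": "1.1.0", "publishedTime": "2023-08-15T12:52:39-03:00"},
    {"version": "1.2.0", "publishedTime": "2023-09-01T09:30:00-03:00"},
    {"version": "1.3.0", "publishedTime": "2023-09-20T08:00:00-03:00"},
]
print(versions_to_delete(summaries))
```

This replaces the per-version `describe-package-version` calls in the bash loop above with one list call per package.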
|
145,507
| 22,703,949,283
|
IssuesEvent
|
2022-07-05 13:12:01
|
ZigZagExchange/frontend
|
https://api.github.com/repos/ZigZagExchange/frontend
|
closed
|
Trade UI - Search pair - Clicking favourite/star should not switch pair
|
uiRedesign
|
Only make user favourite the pair, don't make the user switch the pair
https://gyazo.com/432d965d25760931b22f766b13e71fef
|
1.0
|
Trade UI - Search pair - Clicking favourite/star should not switch pair - Only make user favourite the pair, don't make the user switch the pair
https://gyazo.com/432d965d25760931b22f766b13e71fef
|
non_process
|
trade ui search pair clicking favourite star should not switch pair only make user favourite the pair don t make the user switch the pair
| 0
|
16,262
| 20,841,749,220
|
IssuesEvent
|
2022-03-21 01:26:59
|
duxli/duxli-css
|
https://api.github.com/repos/duxli/duxli-css
|
closed
|
Process Improvement: Add Build Script
|
process
|
# Process Improvement: Add Build Script
We need a script to build the Sass into CSS.
For this issue, we do not need to worry about features like minification, auto-prefixing, etc.
## Proposed Change
1. Add `sass` to project dependencies
2. Add script in `package.json` to build Sass into CSS.
3. Add script in `package.json` for a watch build.
|
1.0
|
Process Improvement: Add Build Script - # Process Improvement: Add Build Script
We need a script to build the Sass into CSS.
For this issue, we do not need to worry about features like minification, auto-prefixing, etc.
## Proposed Change
1. Add `sass` to project dependencies
2. Add script in `package.json` to build Sass into CSS.
3. Add script in `package.json` for a watch build.
|
process
|
process improvement add build script process improvement add build script we need a script to build the sass into css for this issue we do not need to worry about features like minification auto prefixing etc proposed change add sass to project dependencies add script in package json to build sass into css add script in package json for a watch build
| 1
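The proposed change in the issue above can be sketched as a `package.json` fragment; the Sass version and file paths are placeholders, not the project's actual layout:

```json
{
  "devDependencies": {
    "sass": "^1.0.0"
  },
  "scripts": {
    "build": "sass src/main.scss dist/main.css",
    "watch": "sass --watch src/main.scss dist/main.css"
  }
}
```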
|
13,728
| 16,488,064,353
|
IssuesEvent
|
2021-05-24 21:15:20
|
icra/ecam
|
https://api.github.com/repos/icra/ecam
|
closed
|
Configuration | Adapt arrangement
|
in process
|
Dear Lluis, the configuration seems quite "crowded now"

It is unnecessary to have "Advanced settings". Please put the part where you select the IPCC directly under "List of assessments" and above the first assessment.
Further, it should be the default that the "assessment table" is unfolded:

Otherwise it is not intuitive to click on it twice. If you "Create a new assessment" and you select it, the table does not unfold. It only unfolds if you click on it twice. Therefore, it should be unfolded by default (at least the first one)
|
1.0
|
Configuration | Adapt arrangement - Dear Lluis, the configuration seems quite "crowded now"

It is unnecessary to have "Advanced settings". Please put the part where you select the IPCC directly under "List of assessments" and above the first assessment.
Further, it should be the default that the "assessment table" is unfolded:

Otherwise it is not intuitive to click on it twice. If you "Create a new assessment" and you select it, the table does not unfold. It only unfolds if you click on it twice. Therefore, it should be unfolded by default (at least the first one)
|
process
|
configuration adapt arrangement dear lluis the configuration seems quite crowded now it is unnecessary to have advanced settings please put the part where you select the ipcc directly under list of assessments and above the first assessment further it should be the default that the assessment table is unfolded otherwise it is not intuitive to click on it twice if you create a new assessment and you select it the table does not unfold it only unfolds if you click on it twice therefore it should be unfolded by default at least the first one
| 1
|
606,158
| 18,756,049,028
|
IssuesEvent
|
2021-11-05 10:53:18
|
HYPERNETS/hypernets_processor
|
https://api.github.com/repos/HYPERNETS/hypernets_processor
|
closed
|
'Metadata' not found in sequence metadata.txt
|
bug priority: high
|
Any advice on how to fix this error?
`Processing sequence: /SEQ20210201T132732
Reading raw data...
seq /SEQ20210201T132732/metadata.txt
Failed: KeyError('Metadata')
Traceback (most recent call last):
File "/hypernets_processor/hypernets_processor/main/sequence_processor_main.py", line 126, in main
sp.process_sequence(target_sequence)
File "hypernets_processor/hypernets_processor/sequence_processor.py", line 105, in process_sequence
l0_irr,l0_rad,l0_bla,l0_swir_irr,l0_swir_rad,l0_swir_bla = reader.read_sequence(sequence_path,calibration_data_rad,calibration_data_irr,calibration_data_swir_rad,calibration_data_swir_irr)
File "/Users/mms/Projects/hypernets_processor/hypernets_processor/data_io/hypernets_reader.py", line 896, in read_sequence
seq,lat,lon,cc,metadata,seriesIrr,seriesRad,seriesBlack,seriesPict,flag = self.read_metadata(
File "/Users/mms/Projects/hypernets_processor/hypernets_processor/data_io/hypernets_reader.py", line 818, in read_metadata
globalattr = dict(metadata['Metadata'])
File "/Users/mms/opt/miniconda3/envs/hyProcessor/lib/python3.8/configparser.py", line 960, in __getitem__
raise KeyError(key)
KeyError: 'Metadata'
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/Users/mms/opt/miniconda3/envs/hyProcessor/bin/hypernets_sequence_processor", line 33, in <module>
sys.exit(load_entry_point('hypernets-processor', 'console_scripts', 'hypernets_sequence_processor')())
File "/Users/mms/Projects/hypernets_processor/hypernets_processor/cli/sequence_processor_cli.py", line 122, in cli
main(processor_config_path=PROCESSOR_CONFIG_PATH, job_config_path=job_config_path, to_archive=False)
File "/Users/mms/Projects/hypernets_processor/hypernets_processor/main/sequence_processor_main.py", line 132, in main
context.anomaly_db.add_x_anomaly()
AttributeError: 'NoneType' object has no attribute 'add_x_anomaly'`
|
1.0
|
'Metadata' not found in sequence metadata.txt - Any advice on how to fix this error?
`Processing sequence: /SEQ20210201T132732
Reading raw data...
seq /SEQ20210201T132732/metadata.txt
Failed: KeyError('Metadata')
Traceback (most recent call last):
File "/hypernets_processor/hypernets_processor/main/sequence_processor_main.py", line 126, in main
sp.process_sequence(target_sequence)
File "hypernets_processor/hypernets_processor/sequence_processor.py", line 105, in process_sequence
l0_irr,l0_rad,l0_bla,l0_swir_irr,l0_swir_rad,l0_swir_bla = reader.read_sequence(sequence_path,calibration_data_rad,calibration_data_irr,calibration_data_swir_rad,calibration_data_swir_irr)
File "/Users/mms/Projects/hypernets_processor/hypernets_processor/data_io/hypernets_reader.py", line 896, in read_sequence
seq,lat,lon,cc,metadata,seriesIrr,seriesRad,seriesBlack,seriesPict,flag = self.read_metadata(
File "/Users/mms/Projects/hypernets_processor/hypernets_processor/data_io/hypernets_reader.py", line 818, in read_metadata
globalattr = dict(metadata['Metadata'])
File "/Users/mms/opt/miniconda3/envs/hyProcessor/lib/python3.8/configparser.py", line 960, in __getitem__
raise KeyError(key)
KeyError: 'Metadata'
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/Users/mms/opt/miniconda3/envs/hyProcessor/bin/hypernets_sequence_processor", line 33, in <module>
sys.exit(load_entry_point('hypernets-processor', 'console_scripts', 'hypernets_sequence_processor')())
File "/Users/mms/Projects/hypernets_processor/hypernets_processor/cli/sequence_processor_cli.py", line 122, in cli
main(processor_config_path=PROCESSOR_CONFIG_PATH, job_config_path=job_config_path, to_archive=False)
File "/Users/mms/Projects/hypernets_processor/hypernets_processor/main/sequence_processor_main.py", line 132, in main
context.anomaly_db.add_x_anomaly()
AttributeError: 'NoneType' object has no attribute 'add_x_anomaly'`
|
non_process
|
metadata not found in sequence metadata txt any advice on how to fix this error processing sequence reading raw data seq metadata txt failed keyerror metadata traceback most recent call last file hypernets processor hypernets processor main sequence processor main py line in main sp process sequence target sequence file hypernets processor hypernets processor sequence processor py line in process sequence irr rad bla swir irr swir rad swir bla reader read sequence sequence path calibration data rad calibration data irr calibration data swir rad calibration data swir irr file users mms projects hypernets processor hypernets processor data io hypernets reader py line in read sequence seq lat lon cc metadata seriesirr seriesrad seriesblack seriespict flag self read metadata file users mms projects hypernets processor hypernets processor data io hypernets reader py line in read metadata globalattr dict metadata file users mms opt envs hyprocessor lib configparser py line in getitem raise keyerror key keyerror metadata during handling of the above exception another exception occurred traceback most recent call last file users mms opt envs hyprocessor bin hypernets sequence processor line in sys exit load entry point hypernets processor console scripts hypernets sequence processor file users mms projects hypernets processor hypernets processor cli sequence processor cli py line in cli main processor config path processor config path job config path job config path to archive false file users mms projects hypernets processor hypernets processor main sequence processor main py line in main context anomaly db add x anomaly attributeerror nonetype object has no attribute add x anomaly
| 0
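The `KeyError('Metadata')` in the traceback above comes from indexing a `ConfigParser` with a section that is absent from the sequence's metadata.txt; guarding the lookup turns it into a descriptive error. This is a generic sketch of that pattern, not the hypernets_processor code:

```python
import configparser

def read_global_attrs(text, section="Metadata"):
    """Parse INI-style metadata and return the given section as a dict,
    raising a descriptive error instead of a bare KeyError."""
    parser = configparser.ConfigParser()
    parser.read_string(text)
    if not parser.has_section(section):
        raise ValueError(
            f"'{section}' section not found; sections present: {parser.sections()}"
        )
    return dict(parser[section])

# A file with the expected section parses normally...
print(read_global_attrs("[Metadata]\nsite = example\n"))

# ...while a file missing it fails with a message naming what *is* there.
try:
    read_global_attrs("[Other]\nx = 1\n")
except ValueError as e:
    print(e)
```

Raising early with the list of sections actually present makes it obvious whether the metadata.txt is malformed or simply from an unexpected sequence format.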
|
290,139
| 21,844,882,876
|
IssuesEvent
|
2022-05-18 03:01:35
|
adansons/base
|
https://api.github.com/repos/adansons/base
|
closed
|
increment version 0.1.0 -> 0.1.1
|
documentation
|
# Motivation
version number is still `0.1.0` in `pyproject.toml` and `base.__init__.py`
|
1.0
|
increment version 0.1.0 -> 0.1.1 - # Motivation
version number is still `0.1.0` in `pyproject.toml` and `base.__init__.py`
|
non_process
|
increment version motivation version number is still in pyproject toml and base init py
| 0
|
17,453
| 23,270,881,777
|
IssuesEvent
|
2022-08-04 22:55:54
|
MPMG-DCC-UFMG/C01
|
https://api.github.com/repos/MPMG-DCC-UFMG/C01
|
opened
|
Transparência - Maximum link depth
|
[1] Requisito [0] Desenvolvimento [2] Média Prioridade [3] Processamento Dinâmico
|
## Expected Behavior
The maximum link-navigation depth setting is expected to also apply to dynamic navigation.
## Current Behavior
The link-depth limit from the crawler Details tab does not apply to dynamic-processing navigation.
In fact, this feature makes little sense in the current system, because Scrapy's link-extraction function does not apply to dynamic crawls, as reported in #4794.
## Steps to reproduce the error
Not applicable.
## Crawl Specifications
Not applicable.
## System (if needed)
* MP or local: both
* Specific branch: master
* Different system: no
## Screenshots (if needed)
Not applicable.
|
1.0
|
Transparency - Maximum link depth - ## Expected Behavior
The maximum link navigation depth setting is expected to also apply to dynamic navigation.
## Current Behavior
The link depth limit set in the collector's Details tab does not apply to dynamic-processing navigation.
In fact, this feature makes little sense in the current system, because Scrapy's link extraction function does not apply to dynamic collections, as reported in #4794.
## Steps to reproduce the error
Not applicable.
## Collection Specifications
Not applicable.
## System (if needed)
* MP or local: both
* Specific branch: master
* Different system: no
## Screenshots (if needed)
Not applicable.
|
process
|
transparency maximum link depth expected behavior the maximum link navigation depth setting is expected to also apply to dynamic navigation current behavior the link depth limit set in the collector s details tab does not apply to dynamic processing navigation in fact this feature makes little sense in the current system because scrapy s link extraction function does not apply to dynamic collections as reported in steps to reproduce the error not applicable collection specifications not applicable system if needed mp or local both specific branch master different system no screenshots if needed not applicable
| 1
|
182
| 2,588,443,812
|
IssuesEvent
|
2015-02-18 01:24:04
|
dotnet/corefx
|
https://api.github.com/repos/dotnet/corefx
|
opened
|
Add test to verify expected set of environment variables
|
System.Diagnostics.Process
|
It looks like we have a test to verify that ProcessStartInfo gets the right environment variables, but we don't have a test to verify the environment of the child process itself.
|
1.0
|
Add test to verify expected set of environment variables - It looks like we have a test to verify that ProcessStartInfo gets the right environment variables, but we don't have a test to verify the environment of the child process itself.
|
process
|
add test to verify expected set of environment variables it looks like we have a test to verify that processstartinfo gets the right environment variables but we don t have a test to verify the environment of the child process itself
| 1
|
3,069
| 6,062,831,432
|
IssuesEvent
|
2017-06-14 10:23:41
|
symfony/symfony
|
https://api.github.com/repos/symfony/symfony
|
closed
|
[Process] wait() - proposal to softcode sleeping time
|
Feature Process
|
| Q | A
| ---------------- | -----
| Bug report? | no
| Feature request? | yes
| BC Break report? | no
| RFC? | no
| Symfony version | 3.2.8
The current implementation of `wait()` sleeps for 1 ms between process state checks. My Htop reports that routines which are in a wait loop consume about 2% of CPU time. Although that is ok for most tasks, I think that 2% is too much sometimes (when I got a lot of such waiting routines).
So I think that adding the option to change timeout time would be a nice idea. Looking forward to hear your thoughts about it.
|
1.0
|
[Process] wait() - proposal to softcode sleeping time - | Q | A
| ---------------- | -----
| Bug report? | no
| Feature request? | yes
| BC Break report? | no
| RFC? | no
| Symfony version | 3.2.8
The current implementation of `wait()` sleeps for 1 ms between process state checks. My Htop reports that routines which are in a wait loop consume about 2% of CPU time. Although that is ok for most tasks, I think that 2% is too much sometimes (when I got a lot of such waiting routines).
So I think that adding the option to change timeout time would be a nice idea. Looking forward to hear your thoughts about it.
|
process
|
wait proposal to softcode sleeping time q a bug report no feature request yes bc break report no rfc no symfony version the current implementation of wait sleeps for ms between process state checks my htop reports that routines which are in a wait loop consume about of cpu time although that is ok for most tasks i think that is too much sometimes when i got a lot of such waiting routines so i think that adding the option to change timeout time would be a nice idea looking forward to hear your thoughts about it
| 1
|
135,221
| 30,267,855,495
|
IssuesEvent
|
2023-07-07 13:14:42
|
h4sh5/pypi-auto-scanner
|
https://api.github.com/repos/h4sh5/pypi-auto-scanner
|
opened
|
servantcord 1.0.1 has 1 GuardDog issues
|
guarddog code-execution
|
https://pypi.org/project/servantcord
https://inspector.pypi.io/project/servantcord
```{
"dependency": "servantcord",
"version": "1.0.1",
"result": {
"issues": 1,
"errors": {},
"results": {
"code-execution": [
{
"location": "servantcord-1.0.1/setup.py:46",
"code": " subprocess.call(download_path)",
"message": "This package is executing OS commands in the setup.py file"
}
]
},
"path": "/tmp/tmp2hd6j3af/servantcord"
}
}```
|
1.0
|
servantcord 1.0.1 has 1 GuardDog issues - https://pypi.org/project/servantcord
https://inspector.pypi.io/project/servantcord
```{
"dependency": "servantcord",
"version": "1.0.1",
"result": {
"issues": 1,
"errors": {},
"results": {
"code-execution": [
{
"location": "servantcord-1.0.1/setup.py:46",
"code": " subprocess.call(download_path)",
"message": "This package is executing OS commands in the setup.py file"
}
]
},
"path": "/tmp/tmp2hd6j3af/servantcord"
}
}```
|
non_process
|
servantcord has guarddog issues dependency servantcord version result issues errors results code execution location servantcord setup py code subprocess call download path message this package is executing os commands in the setup py file path tmp servantcord
| 0
|
16,241
| 20,794,015,367
|
IssuesEvent
|
2022-03-17 07:12:32
|
arcus-azure/arcus.messaging
|
https://api.github.com/repos/arcus-azure/arcus.messaging
|
closed
|
When calling AddServiceBusQueueMessagePump multiple times only the first registered pump works
|
bug area:message-processing message-pumps
|
**Describe the bug**
I have the need to set up two queue message pumps pointing at a different service bus and queue. When I register the two pumps only the first registered works.
**To Reproduce**
Steps to reproduce the behavior:
In the service configuration set up the pumps:
```services.AddServiceBusQueueMessagePump(string queueA,
secretProvider => secretProvider.GetRawSecretAsync(SecretNames.CONNECTIONTOSB1),
messagePumpOptions =>
{
messagePumpOptions.Correlation.TransactionIdPropertyName = MessageHeaders.TransactionId;
messagePumpOptions.Deserialization.AdditionalMembers = AdditionalMemberHandling.Ignore;
})
.WithServiceBusMessageHandler<MessageHandlerX, MessageTypeX>();
services.AddServiceBusQueueMessagePump(string queueB,
secretProvider => secretProvider.GetRawSecretAsync(SecretNames.CONNECTIONTOSB2),
messagePumpOptions =>
{
messagePumpOptions.Correlation.TransactionIdPropertyName = MessageHeaders.TransactionId;
messagePumpOptions.Deserialization.AdditionalMembers = AdditionalMemberHandling.Ignore;
})
.WithServiceBusMessageHandler<MessageHandlerY, MessageTypeY>();
```
**Expected behavior**
The messages arriving in queueA ServiceBus1 are processed by message handler X
The messages arriving in queueB ServiceBus2 are processed by message handler Y
**Additional context**
Add any other context about the problem here.
- Version 1.1.0
|
1.0
|
When calling AddServiceBusQueueMessagePump multiple times only the first registered pump works - **Describe the bug**
I have the need to set up two queue message pumps pointing at a different service bus and queue. When I register the two pumps only the first registered works.
**To Reproduce**
Steps to reproduce the behavior:
In the service configuration set up the pumps:
```services.AddServiceBusQueueMessagePump(string queueA,
secretProvider => secretProvider.GetRawSecretAsync(SecretNames.CONNECTIONTOSB1),
messagePumpOptions =>
{
messagePumpOptions.Correlation.TransactionIdPropertyName = MessageHeaders.TransactionId;
messagePumpOptions.Deserialization.AdditionalMembers = AdditionalMemberHandling.Ignore;
})
.WithServiceBusMessageHandler<MessageHandlerX, MessageTypeX>();
services.AddServiceBusQueueMessagePump(string queueB,
secretProvider => secretProvider.GetRawSecretAsync(SecretNames.CONNECTIONTOSB2),
messagePumpOptions =>
{
messagePumpOptions.Correlation.TransactionIdPropertyName = MessageHeaders.TransactionId;
messagePumpOptions.Deserialization.AdditionalMembers = AdditionalMemberHandling.Ignore;
})
.WithServiceBusMessageHandler<MessageHandlerY, MessageTypeY>();
```
**Expected behavior**
The messages arriving in queueA ServiceBus1 are processed by message handler X
The messages arriving in queueB ServiceBus2 are processed by message handler Y
**Additional context**
Add any other context about the problem here.
- Version 1.1.0
|
process
|
when calling addservicebusqueuemessagepump multiple times only the first registered pump works describe the bug i have the need to set up two queue message pumps pointing at a different service bus and queue when i register the two pumps only the first registered works to reproduce steps to reproduce the behavior in the service configuration set up the pumps services addservicebusqueuemessagepump string queuea secretprovider secretprovider getrawsecretasync secretnames messagepumpoptions messagepumpoptions correlation transactionidpropertyname messageheaders transactionid messagepumpoptions deserialization additionalmembers additionalmemberhandling ignore withservicebusmessagehandler services addservicebusqueuemessagepump string queueb secretprovider secretprovider getrawsecretasync secretnames messagepumpoptions messagepumpoptions correlation transactionidpropertyname messageheaders transactionid messagepumpoptions deserialization additionalmembers additionalmemberhandling ignore withservicebusmessagehandler expected behavior the messages arriving in queuea are processed by message handler x the messages arriving in queueb are processed by message handler y additional context add any other context about the problem here version
| 1
|
133,749
| 29,513,159,723
|
IssuesEvent
|
2023-06-04 06:56:13
|
nim-lang/Nim
|
https://api.github.com/repos/nim-lang/Nim
|
closed
|
[cpp] segfault in a loop while parsing JSON
|
Severe C++ Code Generation
|
Hello. i'm new and learning. got this weird error while developing my application with --gc:orc
and compiling it with cpp backend `nim cpp -r --gc:orc qqc.nim`
### Example
```nim
import segfaults
import json
let j1 = """ {} """
let j2 = """ {"x":"1"} """
for i in 0..1:
stdout.write i,":"
try:
let node = parseJson((if i == 0: j1 else: j2))
echo node["x"]
except:
echo getCurrentExceptionMsg()
```
### Current Output
```
0:key not found: x
1:"1"
D:\Programm\Nim\Nim-devel\lib\system\fatal.nim(53) qqc
Error: unhandled exception: Could not access value because it is nil. [NilAccessDefect]
```
### Expected Output
```
0:key not found: x
1:"1"
```
### Possible Solution
* use --gc:refc
or
* use c backend with any gc
### Additional Information
```
$ nim -v
Nim Compiler Version 1.5.1 [Windows: amd64]
Compiled at 2021-07-21
Copyright (c) 2006-2021 by Andreas Rumpf
git hash: f86a530214d56f39664d6aaca3ddddf3b2fdba0c
active boot switches: -d:release
```
|
1.0
|
[cpp] segfault in a loop while parsing JSON - Hello. i'm new and learning. got this weird error while developing my application with --gc:orc
and compiling it with cpp backend `nim cpp -r --gc:orc qqc.nim`
### Example
```nim
import segfaults
import json
let j1 = """ {} """
let j2 = """ {"x":"1"} """
for i in 0..1:
stdout.write i,":"
try:
let node = parseJson((if i == 0: j1 else: j2))
echo node["x"]
except:
echo getCurrentExceptionMsg()
```
### Current Output
```
0:key not found: x
1:"1"
D:\Programm\Nim\Nim-devel\lib\system\fatal.nim(53) qqc
Error: unhandled exception: Could not access value because it is nil. [NilAccessDefect]
```
### Expected Output
```
0:key not found: x
1:"1"
```
### Possible Solution
* use --gc:refc
or
* use c backend with any gc
### Additional Information
```
$ nim -v
Nim Compiler Version 1.5.1 [Windows: amd64]
Compiled at 2021-07-21
Copyright (c) 2006-2021 by Andreas Rumpf
git hash: f86a530214d56f39664d6aaca3ddddf3b2fdba0c
active boot switches: -d:release
```
|
non_process
|
segfault in a loop while parsing json hello i m new and learning got this weird error while developing my application with gc orc and compiling it with cpp backend nim cpp r gc orc qqc nim example nim import segfaults import json let let x for i in stdout write i try let node parsejson if i else echo node except echo getcurrentexceptionmsg current output key not found x d programm nim nim devel lib system fatal nim qqc error unhandled exception could not access value because it is nil expected output key not found x possible solution use gc refc or use c backend with any gc additional information nim v nim compiler version compiled at copyright c by andreas rumpf git hash active boot switches d release
| 0
|
12,156
| 14,741,455,090
|
IssuesEvent
|
2021-01-07 10:38:49
|
kdjstudios/SABillingGitlab
|
https://api.github.com/repos/kdjstudios/SABillingGitlab
|
closed
|
SA Billing - Fair Oaks - Invalid Late Fees
|
anc-process anp-1 ant-bug has attachment
|
In GitLab by @kdjstudios on Jan 18, 2019, 08:37
**Submitted by:** "Vanessa Salamanca" <vanessa.salamanca@answernet.com>
**Helpdesk:** http://www.servicedesk.answernet.com/profiles/ticket/6612355
**Server:** Internal
**Client/Site:** Fair Oaks
**Account:** NA
**Issue:**
Please see attached approval for late credits that are invalid.
Approved to have this fixed and completed before next billing cycle.
[Copy+of+Fair+Oaks+1-15-19.xlsx](/uploads/e918f9a65adf55be4d5a494724fddeaa/Copy+of+Fair+Oaks+1-15-19.xlsx)
|
1.0
|
SA Billing - Fair Oaks - Invalid Late Fees - In GitLab by @kdjstudios on Jan 18, 2019, 08:37
**Submitted by:** "Vanessa Salamanca" <vanessa.salamanca@answernet.com>
**Helpdesk:** http://www.servicedesk.answernet.com/profiles/ticket/6612355
**Server:** Internal
**Client/Site:** Fair Oaks
**Account:** NA
**Issue:**
Please see attached approval for late credits that are invalid.
Approved to have this fixed and completed before next billing cycle.
[Copy+of+Fair+Oaks+1-15-19.xlsx](/uploads/e918f9a65adf55be4d5a494724fddeaa/Copy+of+Fair+Oaks+1-15-19.xlsx)
|
process
|
sa billing fair oaks invalid late fees in gitlab by kdjstudios on jan submitted by vanessa salamanca helpdesk server internal client site fair oaks account na issue please see attached approval for late credits that are invalid approved to have this fixed and completed before next billing cycle uploads copy of fair oaks xlsx
| 1
|
156,653
| 19,902,962,371
|
IssuesEvent
|
2022-01-25 09:52:59
|
sultanabubaker/basic-gradle-template
|
https://api.github.com/repos/sultanabubaker/basic-gradle-template
|
closed
|
CVE-2020-36185 (High) detected in jackson-databind-2.6.3.jar - autoclosed
|
security vulnerability
|
## CVE-2020-36185 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>jackson-databind-2.6.3.jar</b></p></summary>
<p>General data-binding functionality for Jackson: works on core streaming API</p>
<p>Library home page: <a href="http://github.com/FasterXML/jackson">http://github.com/FasterXML/jackson</a></p>
<p>Path to dependency file: /build.gradle</p>
<p>Path to vulnerable library: /home/wss-scanner/.gradle/caches/modules-2/files-2.1/com.fasterxml.jackson.core/jackson-databind/2.6.3/5c4fcae53dd82e2c549b8322d78c6ff47c94c8a8/jackson-databind-2.6.3.jar</p>
<p>
Dependency Hierarchy:
- mmtf-serialization-1.0.8.jar (Root Library)
- jackson-dataformat-msgpack-0.7.1.jar
- :x: **jackson-databind-2.6.3.jar** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/sultanabubaker/basic-gradle-template/commit/b874bd2741628d035b5d06e7b312f127340324bf">b874bd2741628d035b5d06e7b312f127340324bf</a></p>
<p>Found in base branch: <b>master</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
FasterXML jackson-databind 2.x before 2.9.10.8 mishandles the interaction between serialization gadgets and typing, related to org.apache.tomcat.dbcp.dbcp2.datasources.SharedPoolDataSource.
<p>Publish Date: 2021-01-06
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-36185>CVE-2020-36185</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>8.1</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: High
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: High
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://github.com/FasterXML/jackson-databind/issues/2998">https://github.com/FasterXML/jackson-databind/issues/2998</a></p>
<p>Release Date: 2021-01-06</p>
<p>Fix Resolution: com.fasterxml.jackson.core:jackson-databind:2.9.10.8</p>
</p>
</details>
<p></p>
<!-- <REMEDIATE>{"isOpenPROnVulnerability":true,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"Java","groupId":"com.fasterxml.jackson.core","packageName":"jackson-databind","packageVersion":"2.6.3","packageFilePaths":["/build.gradle"],"isTransitiveDependency":true,"dependencyTree":"org.rcsb:mmtf-serialization:1.0.8;org.msgpack:jackson-dataformat-msgpack:0.7.1;com.fasterxml.jackson.core:jackson-databind:2.6.3","isMinimumFixVersionAvailable":true,"minimumFixVersion":"com.fasterxml.jackson.core:jackson-databind:2.9.10.8","isBinary":false}],"baseBranches":["master"],"vulnerabilityIdentifier":"CVE-2020-36185","vulnerabilityDetails":"FasterXML jackson-databind 2.x before 2.9.10.8 mishandles the interaction between serialization gadgets and typing, related to org.apache.tomcat.dbcp.dbcp2.datasources.SharedPoolDataSource.","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-36185","cvss3Severity":"high","cvss3Score":"8.1","cvss3Metrics":{"A":"High","AC":"High","PR":"None","S":"Unchanged","C":"High","UI":"None","AV":"Network","I":"High"},"extraData":{}}</REMEDIATE> -->
|
True
|
CVE-2020-36185 (High) detected in jackson-databind-2.6.3.jar - autoclosed - ## CVE-2020-36185 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>jackson-databind-2.6.3.jar</b></p></summary>
<p>General data-binding functionality for Jackson: works on core streaming API</p>
<p>Library home page: <a href="http://github.com/FasterXML/jackson">http://github.com/FasterXML/jackson</a></p>
<p>Path to dependency file: /build.gradle</p>
<p>Path to vulnerable library: /home/wss-scanner/.gradle/caches/modules-2/files-2.1/com.fasterxml.jackson.core/jackson-databind/2.6.3/5c4fcae53dd82e2c549b8322d78c6ff47c94c8a8/jackson-databind-2.6.3.jar</p>
<p>
Dependency Hierarchy:
- mmtf-serialization-1.0.8.jar (Root Library)
- jackson-dataformat-msgpack-0.7.1.jar
- :x: **jackson-databind-2.6.3.jar** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/sultanabubaker/basic-gradle-template/commit/b874bd2741628d035b5d06e7b312f127340324bf">b874bd2741628d035b5d06e7b312f127340324bf</a></p>
<p>Found in base branch: <b>master</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
FasterXML jackson-databind 2.x before 2.9.10.8 mishandles the interaction between serialization gadgets and typing, related to org.apache.tomcat.dbcp.dbcp2.datasources.SharedPoolDataSource.
<p>Publish Date: 2021-01-06
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-36185>CVE-2020-36185</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>8.1</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: High
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: High
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://github.com/FasterXML/jackson-databind/issues/2998">https://github.com/FasterXML/jackson-databind/issues/2998</a></p>
<p>Release Date: 2021-01-06</p>
<p>Fix Resolution: com.fasterxml.jackson.core:jackson-databind:2.9.10.8</p>
</p>
</details>
<p></p>
<!-- <REMEDIATE>{"isOpenPROnVulnerability":true,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"Java","groupId":"com.fasterxml.jackson.core","packageName":"jackson-databind","packageVersion":"2.6.3","packageFilePaths":["/build.gradle"],"isTransitiveDependency":true,"dependencyTree":"org.rcsb:mmtf-serialization:1.0.8;org.msgpack:jackson-dataformat-msgpack:0.7.1;com.fasterxml.jackson.core:jackson-databind:2.6.3","isMinimumFixVersionAvailable":true,"minimumFixVersion":"com.fasterxml.jackson.core:jackson-databind:2.9.10.8","isBinary":false}],"baseBranches":["master"],"vulnerabilityIdentifier":"CVE-2020-36185","vulnerabilityDetails":"FasterXML jackson-databind 2.x before 2.9.10.8 mishandles the interaction between serialization gadgets and typing, related to org.apache.tomcat.dbcp.dbcp2.datasources.SharedPoolDataSource.","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-36185","cvss3Severity":"high","cvss3Score":"8.1","cvss3Metrics":{"A":"High","AC":"High","PR":"None","S":"Unchanged","C":"High","UI":"None","AV":"Network","I":"High"},"extraData":{}}</REMEDIATE> -->
|
non_process
|
cve high detected in jackson databind jar autoclosed cve high severity vulnerability vulnerable library jackson databind jar general data binding functionality for jackson works on core streaming api library home page a href path to dependency file build gradle path to vulnerable library home wss scanner gradle caches modules files com fasterxml jackson core jackson databind jackson databind jar dependency hierarchy mmtf serialization jar root library jackson dataformat msgpack jar x jackson databind jar vulnerable library found in head commit a href found in base branch master vulnerability details fasterxml jackson databind x before mishandles the interaction between serialization gadgets and typing related to org apache tomcat dbcp datasources sharedpooldatasource publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity high privileges required none user interaction none scope unchanged impact metrics confidentiality impact high integrity impact high availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution com fasterxml jackson core jackson databind isopenpronvulnerability true ispackagebased true isdefaultbranch true packages istransitivedependency true dependencytree org rcsb mmtf serialization org msgpack jackson dataformat msgpack com fasterxml jackson core jackson databind isminimumfixversionavailable true minimumfixversion com fasterxml jackson core jackson databind isbinary false basebranches vulnerabilityidentifier cve vulnerabilitydetails fasterxml jackson databind x before mishandles the interaction between serialization gadgets and typing related to org apache tomcat dbcp datasources sharedpooldatasource vulnerabilityurl
| 0
|
6,995
| 10,143,484,897
|
IssuesEvent
|
2019-08-04 12:38:24
|
martinlindhe/wmi_exporter
|
https://api.github.com/repos/martinlindhe/wmi_exporter
|
closed
|
Process collector does not check errors for IIS WorkerProcesses
|
bug collector/process
|
This line should check for err
https://github.com/martinlindhe/wmi_exporter/blob/cb9da1ae222558c9f3039b782f00a44e17f2e4b3/collector/process.go#L194
We might want to consider adding the errcheck linter so this doesn't slip though again.
|
1.0
|
Process collector does not check errors for IIS WorkerProcesses - This line should check for err
https://github.com/martinlindhe/wmi_exporter/blob/cb9da1ae222558c9f3039b782f00a44e17f2e4b3/collector/process.go#L194
We might want to consider adding the errcheck linter so this doesn't slip though again.
|
process
|
process collector does not check errors for iis workerprocesses this line should check for err we might want to consider adding the errcheck linter so this doesn t slip though again
| 1
|
6,671
| 9,785,640,955
|
IssuesEvent
|
2019-06-09 09:24:55
|
EthVM/EthVM
|
https://api.github.com/repos/EthVM/EthVM
|
opened
|
After some time, fetching blocks decreases to 0
|
bug project:processing
|
* **I'm submitting a ...**
- [ ] feature request
- [X] bug report
* **Bug Report**
I run twice on two different Ropsten envs and after sometime the fetching of new blocks decreases to 0 and never recovers:
```
[2019-06-07 22:55:08,033] DEBUG Reorg size = 0, records size = 6 (com.ethvm.kafka.connect.sources.web3.sources.AbstractParityEntitySource)
[2019-06-07 22:55:08,033] DEBUG Polled 6 records (com.ethvm.kafka.connect.sources.web3.ParitySourceTask)
[2019-06-07 22:55:08,046] DEBUG Polling (com.ethvm.kafka.connect.sources.web3.ParitySourceTask)
[2019-06-07 22:55:08,046] DEBUG Next range. Current tail = 913882, current head = 5750432 (com.ethvm.kafka.connect.sources.web3.tracker.CanonicalChainTracker)
[2019-06-07 22:55:08,046] DEBUG Next range from chain tracker: 913882..913882 (com.ethvm.kafka.connect.sources.web3.tracker.CanonicalChainTracker)
[2019-06-07 22:55:08,046] DEBUG Range = 913882..913882, reOrgs = [] (com.ethvm.kafka.connect.sources.web3.sources.AbstractParityEntitySource)
[2019-06-07 22:55:08,046] DEBUG Fetching range: 913882..913882. Next = 913882..913882 (com.ethvm.kafka.connect.sources.web3.sources.ParityFullBlockSource)
[2019-06-07 22:55:08,910] DEBUG Parity request: 913882, 913882, 1000 (com.ethvm.kafka.connect.sources.web3.sources.ParityFullBlockSource)
[2019-06-07 22:55:08,925] DEBUG Total blocks fetched = 913883, fetch count = 1. Elapsed = 879 ms, target fetch = 1000 ms, % of target fetch = 0.879, trace count = 835, avg trace count = 1 (com.ethvm.kafka.connect.sources.web3.sources.AbstractParityEntitySource)
[2019-06-07 22:55:08,925] DEBUG Reorg size = 0, records size = 6 (com.ethvm.kafka.connect.sources.web3.sources.AbstractParityEntitySource)
[2019-06-07 22:55:08,925] DEBUG Polled 6 records (com.ethvm.kafka.connect.sources.web3.ParitySourceTask)
[2019-06-07 22:55:08,975] DEBUG Polling (com.ethvm.kafka.connect.sources.web3.ParitySourceTask)
[2019-06-07 22:55:08,975] DEBUG Next range. Current tail = 913883, current head = 5750432 (com.ethvm.kafka.connect.sources.web3.tracker.CanonicalChainTracker)
[2019-06-07 22:55:08,976] DEBUG Next range from chain tracker: 913883..913883 (com.ethvm.kafka.connect.sources.web3.tracker.CanonicalChainTracker)
[2019-06-07 22:55:08,976] DEBUG Range = 913883..913883, reOrgs = [] (com.ethvm.kafka.connect.sources.web3.sources.AbstractParityEntitySource)
[2019-06-07 22:55:08,976] DEBUG Fetching range: 913883..913883. Next = 913883..913883 (com.ethvm.kafka.connect.sources.web3.sources.ParityFullBlockSource)
[2019-06-07 22:55:10,174] DEBUG Parity request: 913883, 913883, 1000 (com.ethvm.kafka.connect.sources.web3.sources.ParityFullBlockSource)
[2019-06-07 22:55:10,188] DEBUG Total blocks fetched = 913884, fetch count = 1. Elapsed = 1212 ms, target fetch = 1000 ms, % of target fetch = 1.212, trace count = 836, avg trace count = 1 (com.ethvm.kafka.connect.sources.web3.sources.AbstractParityEntitySource)
[2019-06-07 22:55:10,188] DEBUG Reorg size = 0, records size = 6 (com.ethvm.kafka.connect.sources.web3.sources.AbstractParityEntitySource)
[2019-06-07 22:55:10,188] DEBUG Polled 6 records (com.ethvm.kafka.connect.sources.web3.ParitySourceTask)
[2019-06-07 22:55:10,210] DEBUG Polling (com.ethvm.kafka.connect.sources.web3.ParitySourceTask)
[2019-06-07 22:55:10,210] DEBUG Next range. Current tail = 913884, current head = 5750432 (com.ethvm.kafka.connect.sources.web3.tracker.CanonicalChainTracker)
[2019-06-07 22:55:10,210] DEBUG Next range from chain tracker: 913884..913884 (com.ethvm.kafka.connect.sources.web3.tracker.CanonicalChainTracker)
[2019-06-07 22:55:10,210] DEBUG Range = 913884..913884, reOrgs = [] (com.ethvm.kafka.connect.sources.web3.sources.AbstractParityEntitySource)
[2019-06-07 22:55:10,210] DEBUG Fetching range: 913884..913884. Next = 913884..913884 (com.ethvm.kafka.connect.sources.web3.sources.ParityFullBlockSource)
[2019-06-07 22:55:12,265] DEBUG Parity request: 913884, 913884, 1000 (com.ethvm.kafka.connect.sources.web3.sources.ParityFullBlockSource)
[2019-06-07 22:55:12,281] DEBUG Total blocks fetched = 913885, fetch count = 1. Elapsed = 2071 ms, target fetch = 1000 ms, % of target fetch = 2.071, trace count = 838, avg trace count = 1 (com.ethvm.kafka.connect.sources.web3.sources.AbstractParityEntitySource)
[2019-06-07 22:55:12,281] DEBUG Reorg size = 0, records size = 6 (com.ethvm.kafka.connect.sources.web3.sources.AbstractParityEntitySource)
[2019-06-07 22:55:12,281] DEBUG Polled 6 records (com.ethvm.kafka.connect.sources.web3.ParitySourceTask)
[2019-06-07 22:55:12,301] DEBUG Polling (com.ethvm.kafka.connect.sources.web3.ParitySourceTask)
[2019-06-07 22:55:12,301] DEBUG Next range. Current tail = 913885, current head = 5750432 (com.ethvm.kafka.connect.sources.web3.tracker.CanonicalChainTracker)
[2019-06-07 22:55:12,301] DEBUG Next range from chain tracker: 913885..913884 (com.ethvm.kafka.connect.sources.web3.tracker.CanonicalChainTracker)
[2019-06-07 22:55:12,301] DEBUG Range = 913885..913884, reOrgs = [] (com.ethvm.kafka.connect.sources.web3.sources.AbstractParityEntitySource)
[2019-06-07 22:55:12,302] DEBUG Reorg size = 0, records size = 0 (com.ethvm.kafka.connect.sources.web3.sources.AbstractParityEntitySource)
[2019-06-07 22:55:12,302] DEBUG Polled 0 records (com.ethvm.kafka.connect.sources.web3.ParitySourceTask)
[2019-06-07 22:55:12,385] DEBUG Trying to reset tail to 5750433 (com.ethvm.kafka.connect.sources.web3.tracker.CanonicalChainTracker)
[2019-06-07 22:55:12,385] DEBUG Current tail = 913885 (com.ethvm.kafka.connect.sources.web3.tracker.CanonicalChainTracker)
[2019-06-07 22:55:12,385] DEBUG After reset attempt tail = 913885 (com.ethvm.kafka.connect.sources.web3.tracker.CanonicalChainTracker)
[2019-06-07 22:55:13,302] DEBUG Polling (com.ethvm.kafka.connect.sources.web3.ParitySourceTask)
[2019-06-07 22:55:13,302] DEBUG Next range. Current tail = 913885, current head = 5750433 (com.ethvm.kafka.connect.sources.web3.tracker.CanonicalChainTracker)
[2019-06-07 22:55:13,302] DEBUG Next range from chain tracker: 913885..913884 (com.ethvm.kafka.connect.sources.web3.tracker.CanonicalChainTracker)
[2019-06-07 22:55:13,302] DEBUG Range = 913885..913884, reOrgs = [] (com.ethvm.kafka.connect.sources.web3.sources.AbstractParityEntitySource)
[2019-06-07 22:55:13,302] DEBUG Reorg size = 0, records size = 0 (com.ethvm.kafka.connect.sources.web3.sources.AbstractParityEntitySource)
[2019-06-07 22:55:13,302] DEBUG Polled 0 records (com.ethvm.kafka.connect.sources.web3.ParitySourceTask)
[2019-06-07 22:55:14,302] DEBUG Polling (com.ethvm.kafka.connect.sources.web3.ParitySourceTask)
[2019-06-07 22:55:14,303] DEBUG Next range. Current tail = 913885, current head = 5750433 (com.ethvm.kafka.connect.sources.web3.tracker.CanonicalChainTracker)
[2019-06-07 22:55:14,303] DEBUG Next range from chain tracker: 913885..913884 (com.ethvm.kafka.connect.sources.web3.tracker.CanonicalChainTracker)
[2019-06-07 22:55:14,303] DEBUG Range = 913885..913884, reOrgs = [] (com.ethvm.kafka.connect.sources.web3.sources.AbstractParityEntitySource)
[2019-06-07 22:55:14,303] DEBUG Reorg size = 0, records size = 0 (com.ethvm.kafka.connect.sources.web3.sources.AbstractParityEntitySource)
[2019-06-07 22:55:14,303] DEBUG Polled 0 records (com.ethvm.kafka.connect.sources.web3.ParitySourceTask)
[2019-06-07 22:55:15,303] DEBUG Polling (com.ethvm.kafka.connect.sources.web3.ParitySourceTask)
[2019-06-07 22:55:15,303] DEBUG Next range. Current tail = 913885, current head = 5750433 (com.ethvm.kafka.connect.sources.web3.tracker.CanonicalChainTracker)
[2019-06-07 22:55:15,303] DEBUG Next range from chain tracker: 913885..913884 (com.ethvm.kafka.connect.sources.web3.tracker.CanonicalChainTracker)
[2019-06-07 22:55:15,303] DEBUG Range = 913885..913884, reOrgs = [] (com.ethvm.kafka.connect.sources.web3.sources.AbstractParityEntitySource)
[2019-06-07 22:55:15,303] DEBUG Reorg size = 0, records size = 0 (com.ethvm.kafka.connect.sources.web3.sources.AbstractParityEntitySource)
[2019-06-07 22:55:15,303] DEBUG Polled 0 records (com.ethvm.kafka.connect.sources.web3.ParitySourceTask)
[2019-06-07 22:55:16,304] DEBUG Polling (com.ethvm.kafka.connect.sources.web3.ParitySourceTask)
[2019-06-07 22:55:16,304] DEBUG Next range. Current tail = 913885, current head = 5750433 (com.ethvm.kafka.connect.sources.web3.tracker.CanonicalChainTracker)
[2019-06-07 22:55:16,304] DEBUG Next range from chain tracker: 913885..913884 (com.ethvm.kafka.connect.sources.web3.tracker.CanonicalChainTracker)
[2019-06-07 22:55:16,304] DEBUG Range = 913885..913884, reOrgs = [] (com.ethvm.kafka.connect.sources.web3.sources.AbstractParityEntitySource)
[2019-06-07 22:55:16,304] DEBUG Reorg size = 0, records size = 0 (com.ethvm.kafka.connect.sources.web3.sources.AbstractParityEntitySource)
[2019-06-07 22:55:16,304] DEBUG Polled 0 records (com.ethvm.kafka.connect.sources.web3.ParitySourceTask)
[2019-06-07 22:55:17,304] DEBUG Polling (com.ethvm.kafka.connect.sources.web3.ParitySourceTask)
[2019-06-07 22:55:17,304] DEBUG Next range. Current tail = 913885, current head = 5750433 (com.ethvm.kafka.connect.sources.web3.tracker.CanonicalChainTracker)
[2019-06-07 22:55:17,304] DEBUG Next range from chain tracker: 913885..913884 (com.ethvm.kafka.connect.sources.web3.tracker.CanonicalChainTracker)
[2019-06-07 22:55:17,304] DEBUG Range = 913885..913884, reOrgs = [] (com.ethvm.kafka.connect.sources.web3.sources.AbstractParityEntitySource)
[2019-06-07 22:55:17,305] DEBUG Reorg size = 0, records size = 0 (com.ethvm.kafka.connect.sources.web3.sources.AbstractParityEntitySource)
[2019-06-07 22:55:17,305] DEBUG Polled 0 records (com.ethvm.kafka.connect.sources.web3.ParitySourceTask)
[2019-06-07 22:55:18,305] DEBUG Polling (com.ethvm.kafka.connect.sources.web3.ParitySourceTask)
[2019-06-07 22:55:18,305] DEBUG Next range. Current tail = 913885, current head = 5750433 (com.ethvm.kafka.connect.sources.web3.tracker.CanonicalChainTracker)
[2019-06-07 22:55:18,305] DEBUG Next range from chain tracker: 913885..913884 (com.ethvm.kafka.connect.sources.web3.tracker.CanonicalChainTracker)
[2019-06-07 22:55:18,305] DEBUG Range = 913885..913884, reOrgs = [] (com.ethvm.kafka.connect.sources.web3.sources.AbstractParityEntitySource)
[2019-06-07 22:55:18,305] DEBUG Reorg size = 0, records size = 0 (com.ethvm.kafka.connect.sources.web3.sources.AbstractParityEntitySource)
[2019-06-07 22:55:18,305] DEBUG Polled 0 records (com.ethvm.kafka.connect.sources.web3.ParitySourceTask)
[2019-06-07 22:55:19,305] DEBUG Polling (com.ethvm.kafka.connect.sources.web3.ParitySourceTask)
[2019-06-07 22:55:19,306] DEBUG Next range. Current tail = 913885, current head = 5750433 (com.ethvm.kafka.connect.sources.web3.tracker.CanonicalChainTracker)
[2019-06-07 22:55:19,306] DEBUG Next range from chain tracker: 913885..913884 (com.ethvm.kafka.connect.sources.web3.tracker.CanonicalChainTracker)
[2019-06-07 22:55:19,306] DEBUG Range = 913885..913884, reOrgs = [] (com.ethvm.kafka.connect.sources.web3.sources.AbstractParityEntitySource)
[2019-06-07 22:55:19,306] DEBUG Reorg size = 0, records size = 0 (com.ethvm.kafka.connect.sources.web3.sources.AbstractParityEntitySource)
[2019-06-07 22:55:19,306] DEBUG Polled 0 records (com.ethvm.kafka.connect.sources.web3.ParitySourceTask)
[2019-06-07 22:55:20,306] DEBUG Polling (com.ethvm.kafka.connect.sources.web3.ParitySourceTask)
[2019-06-07 22:55:20,307] DEBUG Next range. Current tail = 913885, current head = 5750433 (com.ethvm.kafka.connect.sources.web3.tracker.CanonicalChainTracker)
[2019-06-07 22:55:20,307] DEBUG Next range from chain tracker: 913885..913884 (com.ethvm.kafka.connect.sources.web3.tracker.CanonicalChainTracker)
[2019-06-07 22:55:20,307] DEBUG Range = 913885..913884, reOrgs = [] (com.ethvm.kafka.connect.sources.web3.sources.AbstractParityEntitySource)
[2019-06-07 22:55:20,307] DEBUG Reorg size = 0, records size = 0 (com.ethvm.kafka.connect.sources.web3.sources.AbstractParityEntitySource)
[2019-06-07 22:55:20,307] DEBUG Polled 0 records (com.ethvm.kafka.connect.sources.web3.ParitySourceTask)
[2019-06-07 22:55:21,307] DEBUG Polling (com.ethvm.kafka.connect.sources.web3.ParitySourceTask)
[2019-06-07 22:55:21,307] DEBUG Next range. Current tail = 913885, current head = 5750433 (com.ethvm.kafka.connect.sources.web3.tracker.CanonicalChainTracker)
[2019-06-07 22:55:21,307] DEBUG Next range from chain tracker: 913885..913884 (com.ethvm.kafka.connect.sources.web3.tracker.CanonicalChainTracker)
[2019-06-07 22:55:21,307] DEBUG Range = 913885..913884, reOrgs = [] (com.ethvm.kafka.connect.sources.web3.sources.AbstractParityEntitySource)
[2019-06-07 22:55:21,307] DEBUG Reorg size = 0, records size = 0 (com.ethvm.kafka.connect.sources.web3.sources.AbstractParityEntitySource)
[2019-06-07 22:55:21,307] DEBUG Polled 0 records (com.ethvm.kafka.connect.sources.web3.ParitySourceTask)
[2019-06-07 22:55:22,308] DEBUG Polling (com.ethvm.kafka.connect.sources.web3.ParitySourceTask)
[2019-06-07 22:55:22,308] DEBUG Next range. Current tail = 913885, current head = 5750433 (com.ethvm.kafka.connect.sources.web3.tracker.CanonicalChainTracker)
[2019-06-07 22:55:22,309] DEBUG Next range from chain tracker: 913885..913884 (com.ethvm.kafka.connect.sources.web3.tracker.CanonicalChainTracker)
[2019-06-07 22:55:22,309] DEBUG Range = 913885..913884, reOrgs = [] (com.ethvm.kafka.connect.sources.web3.sources.AbstractParityEntitySource)
[2019-06-07 22:55:22,309] DEBUG Reorg size = 0, records size = 0 (com.ethvm.kafka.connect.sources.web3.sources.AbstractParityEntitySource)
[2019-06-07 22:55:22,309] DEBUG Polled 0 records (com.ethvm.kafka.connect.sources.web3.ParitySourceTask)
[2019-06-07 22:55:23,309] DEBUG Polling (com.ethvm.kafka.connect.sources.web3.ParitySourceTask)
[2019-06-07 22:55:23,309] DEBUG Next range. Current tail = 913885, current head = 5750433 (com.ethvm.kafka.connect.sources.web3.tracker.CanonicalChainTracker)
[2019-06-07 22:55:23,309] DEBUG Next range from chain tracker: 913885..913884 (com.ethvm.kafka.connect.sources.web3.tracker.CanonicalChainTracker)
[2019-06-07 22:55:23,309] DEBUG Range = 913885..913884, reOrgs = [] (com.ethvm.kafka.connect.sources.web3.sources.AbstractParityEntitySource)
[2019-06-07 22:55:23,309] DEBUG Reorg size = 0, records size = 0 (com.ethvm.kafka.connect.sources.web3.sources.AbstractParityEntitySource)
[2019-06-07 22:55:23,309] DEBUG Polled 0 records (com.ethvm.kafka.connect.sources.web3.ParitySourceTask)
[2019-06-07 22:55:24,309] DEBUG Polling (com.ethvm.kafka.connect.sources.web3.ParitySourceTask)
[2019-06-07 22:55:24,310] DEBUG Next range. Current tail = 913885, current head = 5750433 (com.ethvm.kafka.connect.sources.web3.tracker.CanonicalChainTracker)
[2019-06-07 22:55:24,310] DEBUG Next range from chain tracker: 913885..913884 (com.ethvm.kafka.connect.sources.web3.tracker.CanonicalChainTracker)
[2019-06-07 22:55:24,310] DEBUG Range = 913885..913884, reOrgs = [] (com.ethvm.kafka.connect.sources.web3.sources.AbstractParityEntitySource)
[2019-06-07 22:55:24,310] DEBUG Reorg size = 0, records size = 0 (com.ethvm.kafka.connect.sources.web3.sources.AbstractParityEntitySource)
[2019-06-07 22:55:24,310] DEBUG Polled 0 records (com.ethvm.kafka.connect.sources.web3.ParitySourceTask)
[2019-06-07 22:55:24,538] WARN WorkerSinkTask{id=postgres-trace-sink-0} Commit of offsets timed out (org.apache.kafka.connect.runtime.WorkerSinkTask)
[2019-06-07 22:55:25,310] DEBUG Polling (com.ethvm.kafka.connect.sources.web3.ParitySourceTask)
[2019-06-07 22:55:25,310] DEBUG Next range. Current tail = 913885, current head = 5750433 (com.ethvm.kafka.connect.sources.web3.tracker.CanonicalChainTracker)
[2019-06-07 22:55:25,310] DEBUG Next range from chain tracker: 913885..913884 (com.ethvm.kafka.connect.sources.web3.tracker.CanonicalChainTracker)
[Another execution, different from the previous one]
[2019-06-08 13:01:48,342] DEBUG Total blocks fetched = 599402, fetch count = 1. Elapsed = 453 ms, target fetch = 1000 ms, % of target fetch = 0.453, trace count = 689, avg trace count = 1 (com.ethvm.kafka.connect.sources.web3.sources.AbstractParityEntitySource)
[2019-06-08 13:01:48,342] DEBUG Reorg size = 0, records size = 6 (com.ethvm.kafka.connect.sources.web3.sources.AbstractParityEntitySource)
[2019-06-08 13:01:48,342] DEBUG Polled 6 records (com.ethvm.kafka.connect.sources.web3.ParitySourceTask)
[2019-06-08 13:01:48,352] DEBUG Polling (com.ethvm.kafka.connect.sources.web3.ParitySourceTask)
[2019-06-08 13:01:48,352] DEBUG Next range. Current tail = 599402, current head = 5753124 (com.ethvm.kafka.connect.sources.web3.tracker.CanonicalChainTracker)
[2019-06-08 13:01:48,352] DEBUG Next range from chain tracker: 599402..599403 (com.ethvm.kafka.connect.sources.web3.tracker.CanonicalChainTracker)
[2019-06-08 13:01:48,352] DEBUG Range = 599402..599403, reOrgs = [] (com.ethvm.kafka.connect.sources.web3.sources.AbstractParityEntitySource)
[2019-06-08 13:01:48,352] DEBUG Fetching range: 599402..599402. Next = 599402..599402 (com.ethvm.kafka.connect.sources.web3.sources.ParityFullBlockSource)
[2019-06-08 13:01:48,352] DEBUG Fetching range: 599403..599403. Next = 599403..599403 (com.ethvm.kafka.connect.sources.web3.sources.ParityFullBlockSource)
[2019-06-08 13:01:48,421] DEBUG Parity request: 599402, 599402, 1000 (com.ethvm.kafka.connect.sources.web3.sources.ParityFullBlockSource)
[2019-06-08 13:01:48,563] DEBUG Parity request: 599403, 599403, 1000 (com.ethvm.kafka.connect.sources.web3.sources.ParityFullBlockSource)
[2019-06-08 13:01:48,593] DEBUG Total blocks fetched = 599404, fetch count = 2. Elapsed = 241 ms, target fetch = 1000 ms, % of target fetch = 0.241, trace count = 340, avg trace count = 1 (com.ethvm.kafka.connect.sources.web3.sources.AbstractParityEntitySource)
[2019-06-08 13:01:48,593] DEBUG Reorg size = 0, records size = 11 (com.ethvm.kafka.connect.sources.web3.sources.AbstractParityEntitySource)
[2019-06-08 13:01:48,593] DEBUG Polled 11 records (com.ethvm.kafka.connect.sources.web3.ParitySourceTask)
[2019-06-08 13:01:48,598] DEBUG Polling (com.ethvm.kafka.connect.sources.web3.ParitySourceTask)
[2019-06-08 13:01:48,598] DEBUG Next range. Current tail = 599404, current head = 5753124 (com.ethvm.kafka.connect.sources.web3.tracker.CanonicalChainTracker)
[2019-06-08 13:01:48,598] DEBUG Next range from chain tracker: 599404..599407 (com.ethvm.kafka.connect.sources.web3.tracker.CanonicalChainTracker)
[2019-06-08 13:01:48,598] DEBUG Range = 599404..599407, reOrgs = [] (com.ethvm.kafka.connect.sources.web3.sources.AbstractParityEntitySource)
[2019-06-08 13:01:48,598] DEBUG Fetching range: 599404..599404. Next = 599404..599404 (com.ethvm.kafka.connect.sources.web3.sources.ParityFullBlockSource)
[2019-06-08 13:01:48,598] DEBUG Fetching range: 599405..599405. Next = 599405..599405 (com.ethvm.kafka.connect.sources.web3.sources.ParityFullBlockSource)
[2019-06-08 13:01:48,598] DEBUG Fetching range: 599406..599406. Next = 599406..599406 (com.ethvm.kafka.connect.sources.web3.sources.ParityFullBlockSource)
[2019-06-08 13:01:48,598] DEBUG Fetching range: 599407..599407. Next = 599407..599407 (com.ethvm.kafka.connect.sources.web3.sources.ParityFullBlockSource)
[2019-06-08 13:01:48,851] DEBUG Parity request: 599404, 599404, 1000 (com.ethvm.kafka.connect.sources.web3.sources.ParityFullBlockSource)
[2019-06-08 13:01:48,910] DEBUG Parity request: 599405, 599405, 1000 (com.ethvm.kafka.connect.sources.web3.sources.ParityFullBlockSource)
[2019-06-08 13:01:48,940] DEBUG Parity request: 599406, 599406, 1000 (com.ethvm.kafka.connect.sources.web3.sources.ParityFullBlockSource)
[2019-06-08 13:01:49,958] DEBUG Parity request: 599407, 599407, 1000 (com.ethvm.kafka.connect.sources.web3.sources.ParityFullBlockSource)
[2019-06-08 13:01:50,169] DEBUG Total blocks fetched = 599408, fetch count = 4. Elapsed = 1571 ms, target fetch = 1000 ms, % of target fetch = 1.571, trace count = 4587, avg trace count = 4 (com.ethvm.kafka.connect.sources.web3.sources.AbstractParityEntitySource)
[2019-06-08 13:01:50,169] DEBUG Reorg size = 0, records size = 21 (com.ethvm.kafka.connect.sources.web3.sources.AbstractParityEntitySource)
[2019-06-08 13:01:50,169] DEBUG Polled 21 records (com.ethvm.kafka.connect.sources.web3.ParitySourceTask)
[2019-06-08 13:01:50,320] DEBUG Polling (com.ethvm.kafka.connect.sources.web3.ParitySourceTask)
[2019-06-08 13:01:50,322] DEBUG Next range. Current tail = 599408, current head = 5753124 (com.ethvm.kafka.connect.sources.web3.tracker.CanonicalChainTracker)
[2019-06-08 13:01:50,322] DEBUG Next range from chain tracker: 599408..599409 (com.ethvm.kafka.connect.sources.web3.tracker.CanonicalChainTracker)
[2019-06-08 13:01:50,322] DEBUG Range = 599408..599409, reOrgs = [] (com.ethvm.kafka.connect.sources.web3.sources.AbstractParityEntitySource)
[2019-06-08 13:01:50,323] DEBUG Fetching range: 599408..599408. Next = 599408..599408 (com.ethvm.kafka.connect.sources.web3.sources.ParityFullBlockSource)
[2019-06-08 13:01:50,323] DEBUG Fetching range: 599409..599409. Next = 599409..599409 (com.ethvm.kafka.connect.sources.web3.sources.ParityFullBlockSource)
[2019-06-08 13:01:51,095] DEBUG Parity request: 599409, 599409, 1000 (com.ethvm.kafka.connect.sources.web3.sources.ParityFullBlockSource)
[2019-06-08 13:01:51,361] DEBUG Parity request: 599408, 599408, 1000 (com.ethvm.kafka.connect.sources.web3.sources.ParityFullBlockSource)
[2019-06-08 13:01:51,534] DEBUG Total blocks fetched = 599410, fetch count = 2. Elapsed = 1212 ms, target fetch = 1000 ms, % of target fetch = 1.212, trace count = 8935, avg trace count = 16 (com.ethvm.kafka.connect.sources.web3.sources.AbstractParityEntitySource)
[2019-06-08 13:01:51,534] DEBUG Reorg size = 0, records size = 11 (com.ethvm.kafka.connect.sources.web3.sources.AbstractParityEntitySource)
[2019-06-08 13:01:51,534] DEBUG Polled 11 records (com.ethvm.kafka.connect.sources.web3.ParitySourceTask)
[2019-06-08 13:01:51,666] DEBUG Polling (com.ethvm.kafka.connect.sources.web3.ParitySourceTask)
[2019-06-08 13:01:51,666] DEBUG Next range. Current tail = 599410, current head = 5753124 (com.ethvm.kafka.connect.sources.web3.tracker.CanonicalChainTracker)
[2019-06-08 13:01:51,666] DEBUG Next range from chain tracker: 599410..599411 (com.ethvm.kafka.connect.sources.web3.tracker.CanonicalChainTracker)
[2019-06-08 13:01:51,666] DEBUG Range = 599410..599411, reOrgs = [] (com.ethvm.kafka.connect.sources.web3.sources.AbstractParityEntitySource)
[2019-06-08 13:01:51,666] DEBUG Fetching range: 599411..599411. Next = 599411..599411 (com.ethvm.kafka.connect.sources.web3.sources.ParityFullBlockSource)
[2019-06-08 13:01:51,666] DEBUG Fetching range: 599410..599410. Next = 599410..599410 (com.ethvm.kafka.connect.sources.web3.sources.ParityFullBlockSource)
[2019-06-08 13:01:52,999] DEBUG Parity request: 599411, 599411, 1000 (com.ethvm.kafka.connect.sources.web3.sources.ParityFullBlockSource)
[2019-06-08 13:01:53,874] DEBUG Parity request: 599410, 599410, 1000 (com.ethvm.kafka.connect.sources.web3.sources.ParityFullBlockSource)
[2019-06-08 13:01:54,384] DEBUG Total blocks fetched = 599412, fetch count = 2. Elapsed = 2718 ms, target fetch = 1000 ms, % of target fetch = 2.718, trace count = 20348, avg trace count = 21 (com.ethvm.kafka.connect.sources.web3.sources.AbstractParityEntitySource)
[2019-06-08 13:01:54,384] DEBUG Reorg size = 0, records size = 11 (com.ethvm.kafka.connect.sources.web3.sources.AbstractParityEntitySource)
[2019-06-08 13:01:54,384] DEBUG Polled 11 records (com.ethvm.kafka.connect.sources.web3.ParitySourceTask)
[2019-06-08 13:01:54,640] DEBUG Polling (com.ethvm.kafka.connect.sources.web3.ParitySourceTask)
[2019-06-08 13:01:54,640] DEBUG Next range. Current tail = 599412, current head = 5753124 (com.ethvm.kafka.connect.sources.web3.tracker.CanonicalChainTracker)
[2019-06-08 13:01:54,640] DEBUG Next range from chain tracker: 599412..599412 (com.ethvm.kafka.connect.sources.web3.tracker.CanonicalChainTracker)
[2019-06-08 13:01:54,640] DEBUG Range = 599412..599412, reOrgs = [] (com.ethvm.kafka.connect.sources.web3.sources.AbstractParityEntitySource)
[2019-06-08 13:01:54,640] DEBUG Fetching range: 599412..599412. Next = 599412..599412 (com.ethvm.kafka.connect.sources.web3.sources.ParityFullBlockSource)
[2019-06-08 13:01:55,872] DEBUG Parity request: 599412, 599412, 1000 (com.ethvm.kafka.connect.sources.web3.sources.ParityFullBlockSource)
[2019-06-08 13:01:56,157] DEBUG Total blocks fetched = 599413, fetch count = 1. Elapsed = 1517 ms, target fetch = 1000 ms, % of target fetch = 1.517, trace count = 8474, avg trace count = 11 (com.ethvm.kafka.connect.sources.web3.sources.AbstractParityEntitySource)
[2019-06-08 13:01:56,157] DEBUG Reorg size = 0, records size = 6 (com.ethvm.kafka.connect.sources.web3.sources.AbstractParityEntitySource)
[2019-06-08 13:01:56,158] DEBUG Polled 6 records (com.ethvm.kafka.connect.sources.web3.ParitySourceTask)
[2019-06-08 13:01:56,233] DEBUG Polling (com.ethvm.kafka.connect.sources.web3.ParitySourceTask)
[2019-06-08 13:01:56,233] DEBUG Next range. Current tail = 599413, current head = 5753124 (com.ethvm.kafka.connect.sources.web3.tracker.CanonicalChainTracker)
[2019-06-08 13:01:56,233] DEBUG Next range from chain tracker: 599413..599412 (com.ethvm.kafka.connect.sources.web3.tracker.CanonicalChainTracker)
[2019-06-08 13:01:56,233] DEBUG Range = 599413..599412, reOrgs = [] (com.ethvm.kafka.connect.sources.web3.sources.AbstractParityEntitySource)
[2019-06-08 13:01:56,233] DEBUG Reorg size = 0, records size = 0 (com.ethvm.kafka.connect.sources.web3.sources.AbstractParityEntitySource)
[2019-06-08 13:01:56,233] DEBUG Polled 0 records (com.ethvm.kafka.connect.sources.web3.ParitySourceTask)
[2019-06-08 13:01:57,233] DEBUG Polling (com.ethvm.kafka.connect.sources.web3.ParitySourceTask)
[2019-06-08 13:01:57,233] DEBUG Next range. Current tail = 599413, current head = 5753124 (com.ethvm.kafka.connect.sources.web3.tracker.CanonicalChainTracker)
[2019-06-08 13:01:57,233] DEBUG Next range from chain tracker: 599413..599412 (com.ethvm.kafka.connect.sources.web3.tracker.CanonicalChainTracker)
[2019-06-08 13:01:57,233] DEBUG Range = 599413..599412, reOrgs = [] (com.ethvm.kafka.connect.sources.web3.sources.AbstractParityEntitySource)
[2019-06-08 13:01:57,233] DEBUG Reorg size = 0, records size = 0 (com.ethvm.kafka.connect.sources.web3.sources.AbstractParityEntitySource)
[2019-06-08 13:01:57,234] DEBUG Polled 0 records (com.ethvm.kafka.connect.sources.web3.ParitySourceTask)
[2019-06-08 13:01:58,234] DEBUG Polling (com.ethvm.kafka.connect.sources.web3.ParitySourceTask)
[2019-06-08 13:01:58,234] DEBUG Next range. Current tail = 599413, current head = 5753124 (com.ethvm.kafka.connect.sources.web3.tracker.CanonicalChainTracker)
[2019-06-08 13:01:58,234] DEBUG Next range from chain tracker: 599413..599412 (com.ethvm.kafka.connect.sources.web3.tracker.CanonicalChainTracker)
[2019-06-08 13:01:58,234] DEBUG Range = 599413..599412, reOrgs = [] (com.ethvm.kafka.connect.sources.web3.sources.AbstractParityEntitySource)
[2019-06-08 13:01:58,234] DEBUG Reorg size = 0, records size = 0 (com.ethvm.kafka.connect.sources.web3.sources.AbstractParityEntitySource)
[2019-06-08 13:01:58,234] DEBUG Polled 0 records (com.ethvm.kafka.connect.sources.web3.ParitySourceTask)
[2019-06-08 13:01:59,234] DEBUG Polling (com.ethvm.kafka.connect.sources.web3.ParitySourceTask)
[2019-06-08 13:01:59,234] DEBUG Next range. Current tail = 599413, current head = 5753124 (com.ethvm.kafka.connect.sources.web3.tracker.CanonicalChainTracker)
[2019-06-08 13:01:59,234] DEBUG Next range from chain tracker: 599413..599412 (com.ethvm.kafka.connect.sources.web3.tracker.CanonicalChainTracker)
[2019-06-08 13:01:59,234] DEBUG Range = 599413..599412, reOrgs = [] (com.ethvm.kafka.connect.sources.web3.sources.AbstractParityEntitySource)
[2019-06-08 13:01:59,234] DEBUG Reorg size = 0, records size = 0 (com.ethvm.kafka.connect.sources.web3.sources.AbstractParityEntitySource)
[2019-06-08 13:01:59,235] DEBUG Polled 0 records (com.ethvm.kafka.connect.sources.web3.ParitySourceTask)
[2019-06-08 13:02:00,235] DEBUG Polling (com.ethvm.kafka.connect.sources.web3.ParitySourceTask)
[2019-06-08 13:02:00,235] DEBUG Next range. Current tail = 599413, current head = 5753124 (com.ethvm.kafka.connect.sources.web3.tracker.CanonicalChainTracker)
[2019-06-08 13:02:00,235] DEBUG Next range from chain tracker: 599413..599412 (com.ethvm.kafka.connect.sources.web3.tracker.CanonicalChainTracker)
[2019-06-08 13:02:00,235] DEBUG Range = 599413..599412, reOrgs = [] (com.ethvm.kafka.connect.sources.web3.sources.AbstractParityEntitySource)
[2019-06-08 13:02:00,235] DEBUG Reorg size = 0, records size = 0 (com.ethvm.kafka.connect.sources.web3.sources.AbstractParityEntitySource)
[2019-06-08 13:02:00,235] DEBUG Polled 0 records (com.ethvm.kafka.connect.sources.web3.ParitySourceTask)
[2019-06-08 13:02:01,235] DEBUG Polling (com.ethvm.kafka.connect.sources.web3.ParitySourceTask)
[2019-06-08 13:02:01,235] DEBUG Next range. Current tail = 599413, current head = 5753124 (com.ethvm.kafka.connect.sources.web3.tracker.CanonicalChainTracker)
[2019-06-08 13:02:01,235] DEBUG Next range from chain tracker: 599413..599412 (com.ethvm.kafka.connect.sources.web3.tracker.CanonicalChainTracker)
[2019-06-08 13:02:01,236] DEBUG Range = 599413..599412, reOrgs = [] (com.ethvm.kafka.connect.sources.web3.sources.AbstractParityEntitySource)
[2019-06-08 13:02:01,236] DEBUG Reorg size = 0, records size = 0 (com.ethvm.kafka.connect.sources.web3.sources.AbstractParityEntitySource)
[2019-06-08 13:02:01,236] DEBUG Polled 0 records (com.ethvm.kafka.connect.sources.web3.ParitySourceTask)
[2019-06-08 13:02:02,236] DEBUG Polling (com.ethvm.kafka.connect.sources.web3.ParitySourceTask)
[2019-06-08 13:02:02,236] DEBUG Next range. Current tail = 599413, current head = 5753124 (com.ethvm.kafka.connect.sources.web3.tracker.CanonicalChainTracker)
[2019-06-08 13:02:02,236] DEBUG Next range from chain tracker: 599413..599412 (com.ethvm.kafka.connect.sources.web3.tracker.CanonicalChainTracker)
[2019-06-08 13:02:02,236] DEBUG Range = 599413..599412, reOrgs = [] (com.ethvm.kafka.connect.sources.web3.sources.AbstractParityEntitySource)
[2019-06-08 13:02:02,236] DEBUG Reorg size = 0, records size = 0 (com.ethvm.kafka.connect.sources.web3.sources.AbstractParityEntitySource)
[2019-06-08 13:02:02,236] DEBUG Polled 0 records (com.ethvm.kafka.connect.sources.web3.ParitySourceTask)
[2019-06-08 13:02:03,236] DEBUG Polling (com.ethvm.kafka.connect.sources.web3.ParitySourceTask)
[2019-06-08 13:02:03,236] DEBUG Next range. Current tail = 599413, current head = 5753124 (com.ethvm.kafka.connect.sources.web3.tracker.CanonicalChainTracker)
[2019-06-08 13:02:03,236] DEBUG Next range from chain tracker: 599413..599412 (com.ethvm.kafka.connect.sources.web3.tracker.CanonicalChainTracker)
[2019-06-08 13:02:03,236] DEBUG Range = 599413..599412, reOrgs = [] (com.ethvm.kafka.connect.sources.web3.sources.AbstractParityEntitySource)
[2019-06-08 13:02:03,236] DEBUG Reorg size = 0, records size = 0 (com.ethvm.kafka.connect.sources.web3.sources.AbstractParityEntitySource)
[2019-06-08 13:02:03,236] DEBUG Polled 0 records (com.ethvm.kafka.connect.sources.web3.ParitySourceTask)
[2019-06-08 13:02:04,237] DEBUG Polling (com.ethvm.kafka.connect.sources.web3.ParitySourceTask)
[2019-06-08 13:02:04,237] DEBUG Next range. Current tail = 599413, current head = 5753124 (com.ethvm.kafka.connect.sources.web3.tracker.CanonicalChainTracker)
[2019-06-08 13:02:04,237] DEBUG Next range from chain tracker: 599413..599412 (com.ethvm.kafka.connect.sources.web3.tracker.CanonicalChainTracker)
[2019-06-08 13:02:04,237] DEBUG Range = 599413..599412, reOrgs = [] (com.ethvm.kafka.connect.sources.web3.sources.AbstractParityEntitySource)
[2019-06-08 13:02:04,237] DEBUG Reorg size = 0, records size = 0 (com.ethvm.kafka.connect.sources.web3.sources.AbstractParityEntitySource)
[2019-06-08 13:02:04,237] DEBUG Polled 0 records (com.ethvm.kafka.connect.sources.web3.ParitySourceTask)
[2019-06-08 13:02:05,237] DEBUG Polling (com.ethvm.kafka.connect.sources.web3.ParitySourceTask)
[2019-06-08 13:02:05,237] DEBUG Next range. Current tail = 599413, current head = 5753124 (com.ethvm.kafka.connect.sources.web3.tracker.CanonicalChainTracker)
[2019-06-08 13:02:05,237] DEBUG Next range from chain tracker: 599413..599412 (com.ethvm.kafka.connect.sources.web3.tracker.CanonicalChainTracker)
[2019-06-08 13:02:05,237] DEBUG Range = 599413..599412, reOrgs = [] (com.ethvm.kafka.connect.sources.web3.sources.AbstractParityEntitySource)
[2019-06-08 13:02:05,237] DEBUG Reorg size = 0, records size = 0 (com.ethvm.kafka.connect.sources.web3.sources.AbstractParityEntitySource)
[2019-06-08 13:02:05,238] DEBUG Polled 0 records (com.ethvm.kafka.connect.sources.web3.ParitySourceTask)
[2019-06-08 13:02:05,851] DEBUG Trying to reset tail to 5753125 (com.ethvm.kafka.connect.sources.web3.tracker.CanonicalChainTracker)
[2019-06-08 13:02:05,852] DEBUG Current tail = 599413 (com.ethvm.kafka.connect.sources.web3.tracker.CanonicalChainTracker)
[2019-06-08 13:02:05,852] DEBUG After reset attempt tail = 599413 (com.ethvm.kafka.connect.sources.web3.tracker.CanonicalChainTracker)
[2019-06-08 13:02:06,238] DEBUG Polling (com.ethvm.kafka.connect.sources.web3.ParitySourceTask)
[2019-06-08 13:02:06,238] DEBUG Next range. Current tail = 599413, current head = 5753125 (com.ethvm.kafka.connect.sources.web3.tracker.CanonicalChainTracker)
[2019-06-08 13:02:06,238] DEBUG Next range from chain tracker: 599413..599412 (com.ethvm.kafka.connect.sources.web3.tracker.CanonicalChainTracker)
[2019-06-08 13:02:06,238] DEBUG Range = 599413..599412, reOrgs = [] (com.ethvm.kafka.connect.sources.web3.sources.AbstractParityEntitySource)
[2019-06-08 13:02:06,238] DEBUG Reorg size = 0, records size = 0 (com.ethvm.kafka.connect.sources.web3.sources.AbstractParityEntitySource)
[2019-06-08 13:02:06,238] DEBUG Polled 0 records (com.ethvm.kafka.connect.sources.web3.ParitySourceTask)
[2019-06-08 13:02:07,241] DEBUG Polling (com.ethvm.kafka.connect.sources.web3.ParitySourceTask)
[2019-06-08 13:02:07,242] DEBUG Next range. Current tail = 599413, current head = 5753125 (com.ethvm.kafka.connect.sources.web3.tracker.CanonicalChainTracker)
[2019-06-08 13:02:07,242] DEBUG Next range from chain tracker: 599413..599412 (com.ethvm.kafka.connect.sources.web3.tracker.CanonicalChainTracker)
[2019-06-08 13:02:07,242] DEBUG Range = 599413..599412, reOrgs = [] (com.ethvm.kafka.connect.sources.web3.sources.AbstractParityEntitySource)
[2019-06-08 13:02:07,242] DEBUG Reorg size = 0, records size = 0 (com.ethvm.kafka.connect.sources.web3.sources.AbstractParityEntitySource)
[2019-06-08 13:02:07,242] DEBUG Polled 0 records (com.ethvm.kafka.connect.sources.web3.ParitySourceTask)
[2019-06-08 13:02:08,242] DEBUG Polling (com.ethvm.kafka.connect.sources.web3.ParitySourceTask)
[2019-06-08 13:02:08,242] DEBUG Next range. Current tail = 599413, current head = 5753125 (com.ethvm.kafka.connect.sources.web3.tracker.CanonicalChainTracker)
[2019-06-08 13:02:08,242] DEBUG Next range from chain tracker: 599413..599412 (com.ethvm.kafka.connect.sources.web3.tracker.CanonicalChainTracker)
[2019-06-08 13:02:08,242] DEBUG Range = 599413..599412, reOrgs = [] (com.ethvm.kafka.connect.sources.web3.sources.AbstractParityEntitySource)
[2019-06-08 13:02:08,242] DEBUG Reorg size = 0, records size = 0 (com.ethvm.kafka.connect.sources.web3.sources.AbstractParityEntitySource)
[2019-06-08 13:02:08,242] DEBUG Polled 0 records (com.ethvm.kafka.connect.sources.web3.ParitySourceTask)
[2019-06-08 13:02:09,242] DEBUG Polling (com.ethvm.kafka.connect.sources.web3.ParitySourceTask)
[2019-06-08 13:02:09,242] DEBUG Next range. Current tail = 599413, current head = 5753125 (com.ethvm.kafka.connect.sources.web3.tracker.CanonicalChainTracker)
[2019-06-08 13:02:09,242] DEBUG Next range from chain tracker: 599413..599412 (com.ethvm.kafka.connect.sources.web3.tracker.CanonicalChainTracker)
[2019-06-08 13:02:09,242] DEBUG Range = 599413..599412, reOrgs = [] (com.ethvm.kafka.connect.sources.web3.sources.AbstractParityEntitySource)
[2019-06-08 13:02:09,242] DEBUG Reorg size = 0, records size = 0 (com.ethvm.kafka.connect.sources.web3.sources.AbstractParityEntitySource)
[2019-06-08 13:02:09,242] DEBUG Polled 0 records (com.ethvm.kafka.connect.sources.web3.ParitySourceTask)
[2019-06-08 13:02:10,243] DEBUG Polling (com.ethvm.kafka.connect.sources.web3.ParitySourceTask)
[2019-06-08 13:02:10,243] DEBUG Next range. Current tail = 599413, current head = 5753125 (com.ethvm.kafka.connect.sources.web3.tracker.CanonicalChainTracker)
[2019-06-08 13:02:10,243] DEBUG Next range from chain tracker: 599413..599412 (com.ethvm.kafka.connect.sources.web3.tracker.CanonicalChainTracker)
[2019-06-08 13:02:10,243] DEBUG Range = 599413..599412, reOrgs = [] (com.ethvm.kafka.connect.sources.web3.sources.AbstractParityEntitySource)
[2019-06-08 13:02:10,243] DEBUG Reorg size = 0, records size = 0 (com.ethvm.kafka.connect.sources.web3.sources.AbstractParityEntitySource)
[2019-06-08 13:02:10,243] DEBUG Polled 0 records (com.ethvm.kafka.connect.sources.web3.ParitySourceTask)
[2019-06-08 13:02:11,243] DEBUG Polling (com.ethvm.kafka.connect.sources.web3.ParitySourceTask)
[2019-06-08 13:02:11,243] DEBUG Next range. Current tail = 599413, current head = 5753125 (com.ethvm.kafka.connect.sources.web3.tracker.CanonicalChainTracker)
[2019-06-08 13:02:11,243] DEBUG Next range from chain tracker: 599413..599412 (com.ethvm.kafka.connect.sources.web3.tracker.CanonicalChainTracker)
[2019-06-08 13:02:11,243] DEBUG Range = 599413..599412, reOrgs = [] (com.ethvm.kafka.connect.sources.web3.sources.AbstractParityEntitySource)
[2019-06-08 13:02:11,243] DEBUG Reorg size = 0, records size = 0 (com.ethvm.kafka.connect.sources.web3.sources.AbstractParityEntitySource)
[2019-06-08 13:02:11,243] DEBUG Polled 0 records (com.ethvm.kafka.connect.sources.web3.ParitySourceTask)
[2019-06-08 13:02:12,243] DEBUG Polling (com.ethvm.kafka.connect.sources.web3.ParitySourceTask)
[2019-06-08 13:02:12,243] DEBUG Next range. Current tail = 599413, current head = 5753125 (com.ethvm.kafka.connect.sources.web3.tracker.CanonicalChainTracker)
[2019-06-08 13:02:12,244] DEBUG Next range from chain tracker: 599413..599412 (com.ethvm.kafka.connect.sources.web3.tracker.CanonicalChainTracker)
[2019-06-08 13:02:12,244] DEBUG Range = 599413..599412, reOrgs = [] (com.ethvm.kafka.connect.sources.web3.sources.AbstractParityEntitySource)
[2019-06-08 13:02:12,244] DEBUG Reorg size = 0, records size = 0 (com.ethvm.kafka.connect.sources.web3.sources.AbstractParityEntitySource)
[2019-06-08 13:02:12,244] DEBUG Polled 0 records (com.ethvm.kafka.connect.sources.web3.ParitySourceTask)
[2019-06-08 13:02:13,244] DEBUG Polling (com.ethvm.kafka.connect.sources.web3.ParitySourceTask)
[2019-06-08 13:02:13,244] DEBUG Next range. Current tail = 599413, current head = 5753125 (com.ethvm.kafka.connect.sources.web3.tracker.CanonicalChainTracker)
[2019-06-08 13:02:13,244] DEBUG Next range from chain tracker: 599413..599412 (com.ethvm.kafka.connect.sources.web3.tracker.CanonicalChainTracker)
[2019-06-08 13:02:13,244] DEBUG Range = 599413..599412, reOrgs = [] (com.ethvm.kafka.connect.sources.web3.sources.AbstractParityEntitySource)
[2019-06-08 13:02:13,244] DEBUG Reorg size = 0, records size = 0 (com.ethvm.kafka.connect.sources.web3.sources.AbstractParityEntitySource)
[2019-06-08 13:02:13,244] DEBUG Polled 0 records (com.ethvm.kafka.connect.sources.web3.ParitySourceTask)
```
**After some time, fetching blocks decreases to 0**

* **I'm submitting a ...**
- [ ] feature request
- [X] bug report
* **Bug Report**
I ran this twice on two different Ropsten environments, and after some time the fetching of new blocks drops to 0 and never recovers:
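The pattern in the log below — an inverted next range (e.g. `599413..599412`) together with a failed tail reset ("Trying to reset tail to 5753125 ... After reset attempt tail = 599413") — is consistent with a reset that only applies when the tail still equals a previously observed value. The following is a minimal hypothetical sketch of that behavior (the class and method names are assumptions for illustration, not the actual `CanonicalChainTracker` code):

```python
class ChainTracker:
    """Hypothetical sketch of the tail-reset behavior visible in the log;
    not the real CanonicalChainTracker implementation."""

    def __init__(self, tail: int):
        self.tail = tail

    def try_reset_tail(self, observed_tail: int, new_tail: int) -> int:
        # The reset only takes effect if the tail still equals the value
        # the caller last observed (compare-and-set semantics). A stale
        # snapshot turns the reset into a no-op.
        if self.tail == observed_tail:
            self.tail = new_tail
        return self.tail

tracker = ChainTracker(tail=599413)
# Snapshot taken before the tail advanced from 599412 to 599413:
after = tracker.try_reset_tail(observed_tail=599412, new_tail=5753125)
print("After reset attempt tail =", after)  # → 599413 (the reset was a no-op)
```

Under this reading, once the tracker hands out an empty range (`tail..tail-1`) and the reset keeps failing against a stale snapshot, every subsequent poll returns 0 records, which matches the log.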
```
[2019-06-07 22:55:08,033] DEBUG Reorg size = 0, records size = 6 (com.ethvm.kafka.connect.sources.web3.sources.AbstractParityEntitySource)
[2019-06-07 22:55:08,033] DEBUG Polled 6 records (com.ethvm.kafka.connect.sources.web3.ParitySourceTask)
[2019-06-07 22:55:08,046] DEBUG Polling (com.ethvm.kafka.connect.sources.web3.ParitySourceTask)
[2019-06-07 22:55:08,046] DEBUG Next range. Current tail = 913882, current head = 5750432 (com.ethvm.kafka.connect.sources.web3.tracker.CanonicalChainTracker)
[2019-06-07 22:55:08,046] DEBUG Next range from chain tracker: 913882..913882 (com.ethvm.kafka.connect.sources.web3.tracker.CanonicalChainTracker)
[2019-06-07 22:55:08,046] DEBUG Range = 913882..913882, reOrgs = [] (com.ethvm.kafka.connect.sources.web3.sources.AbstractParityEntitySource)
[2019-06-07 22:55:08,046] DEBUG Fetching range: 913882..913882. Next = 913882..913882 (com.ethvm.kafka.connect.sources.web3.sources.ParityFullBlockSource)
[2019-06-07 22:55:08,910] DEBUG Parity request: 913882, 913882, 1000 (com.ethvm.kafka.connect.sources.web3.sources.ParityFullBlockSource)
[2019-06-07 22:55:08,925] DEBUG Total blocks fetched = 913883, fetch count = 1. Elapsed = 879 ms, target fetch = 1000 ms, % of target fetch = 0.879, trace count = 835, avg trace count = 1 (com.ethvm.kafka.connect.sources.web3.sources.AbstractParityEntitySource)
[2019-06-07 22:55:08,925] DEBUG Reorg size = 0, records size = 6 (com.ethvm.kafka.connect.sources.web3.sources.AbstractParityEntitySource)
[2019-06-07 22:55:08,925] DEBUG Polled 6 records (com.ethvm.kafka.connect.sources.web3.ParitySourceTask)
[2019-06-07 22:55:08,975] DEBUG Polling (com.ethvm.kafka.connect.sources.web3.ParitySourceTask)
[2019-06-07 22:55:08,975] DEBUG Next range. Current tail = 913883, current head = 5750432 (com.ethvm.kafka.connect.sources.web3.tracker.CanonicalChainTracker)
[2019-06-07 22:55:08,976] DEBUG Next range from chain tracker: 913883..913883 (com.ethvm.kafka.connect.sources.web3.tracker.CanonicalChainTracker)
[2019-06-07 22:55:08,976] DEBUG Range = 913883..913883, reOrgs = [] (com.ethvm.kafka.connect.sources.web3.sources.AbstractParityEntitySource)
[2019-06-07 22:55:08,976] DEBUG Fetching range: 913883..913883. Next = 913883..913883 (com.ethvm.kafka.connect.sources.web3.sources.ParityFullBlockSource)
[2019-06-07 22:55:10,174] DEBUG Parity request: 913883, 913883, 1000 (com.ethvm.kafka.connect.sources.web3.sources.ParityFullBlockSource)
[2019-06-07 22:55:10,188] DEBUG Total blocks fetched = 913884, fetch count = 1. Elapsed = 1212 ms, target fetch = 1000 ms, % of target fetch = 1.212, trace count = 836, avg trace count = 1 (com.ethvm.kafka.connect.sources.web3.sources.AbstractParityEntitySource)
[2019-06-07 22:55:10,188] DEBUG Reorg size = 0, records size = 6 (com.ethvm.kafka.connect.sources.web3.sources.AbstractParityEntitySource)
[2019-06-07 22:55:10,188] DEBUG Polled 6 records (com.ethvm.kafka.connect.sources.web3.ParitySourceTask)
[2019-06-07 22:55:10,210] DEBUG Polling (com.ethvm.kafka.connect.sources.web3.ParitySourceTask)
[2019-06-07 22:55:10,210] DEBUG Next range. Current tail = 913884, current head = 5750432 (com.ethvm.kafka.connect.sources.web3.tracker.CanonicalChainTracker)
[2019-06-07 22:55:10,210] DEBUG Next range from chain tracker: 913884..913884 (com.ethvm.kafka.connect.sources.web3.tracker.CanonicalChainTracker)
[2019-06-07 22:55:10,210] DEBUG Range = 913884..913884, reOrgs = [] (com.ethvm.kafka.connect.sources.web3.sources.AbstractParityEntitySource)
[2019-06-07 22:55:10,210] DEBUG Fetching range: 913884..913884. Next = 913884..913884 (com.ethvm.kafka.connect.sources.web3.sources.ParityFullBlockSource)
[2019-06-07 22:55:12,265] DEBUG Parity request: 913884, 913884, 1000 (com.ethvm.kafka.connect.sources.web3.sources.ParityFullBlockSource)
[2019-06-07 22:55:12,281] DEBUG Total blocks fetched = 913885, fetch count = 1. Elapsed = 2071 ms, target fetch = 1000 ms, % of target fetch = 2.071, trace count = 838, avg trace count = 1 (com.ethvm.kafka.connect.sources.web3.sources.AbstractParityEntitySource)
[2019-06-07 22:55:12,281] DEBUG Reorg size = 0, records size = 6 (com.ethvm.kafka.connect.sources.web3.sources.AbstractParityEntitySource)
[2019-06-07 22:55:12,281] DEBUG Polled 6 records (com.ethvm.kafka.connect.sources.web3.ParitySourceTask)
[2019-06-07 22:55:12,301] DEBUG Polling (com.ethvm.kafka.connect.sources.web3.ParitySourceTask)
[2019-06-07 22:55:12,301] DEBUG Next range. Current tail = 913885, current head = 5750432 (com.ethvm.kafka.connect.sources.web3.tracker.CanonicalChainTracker)
[2019-06-07 22:55:12,301] DEBUG Next range from chain tracker: 913885..913884 (com.ethvm.kafka.connect.sources.web3.tracker.CanonicalChainTracker)
[2019-06-07 22:55:12,301] DEBUG Range = 913885..913884, reOrgs = [] (com.ethvm.kafka.connect.sources.web3.sources.AbstractParityEntitySource)
[2019-06-07 22:55:12,302] DEBUG Reorg size = 0, records size = 0 (com.ethvm.kafka.connect.sources.web3.sources.AbstractParityEntitySource)
[2019-06-07 22:55:12,302] DEBUG Polled 0 records (com.ethvm.kafka.connect.sources.web3.ParitySourceTask)
[2019-06-07 22:55:12,385] DEBUG Trying to reset tail to 5750433 (com.ethvm.kafka.connect.sources.web3.tracker.CanonicalChainTracker)
[2019-06-07 22:55:12,385] DEBUG Current tail = 913885 (com.ethvm.kafka.connect.sources.web3.tracker.CanonicalChainTracker)
[2019-06-07 22:55:12,385] DEBUG After reset attempt tail = 913885 (com.ethvm.kafka.connect.sources.web3.tracker.CanonicalChainTracker)
[2019-06-07 22:55:13,302] DEBUG Polling (com.ethvm.kafka.connect.sources.web3.ParitySourceTask)
[2019-06-07 22:55:13,302] DEBUG Next range. Current tail = 913885, current head = 5750433 (com.ethvm.kafka.connect.sources.web3.tracker.CanonicalChainTracker)
[2019-06-07 22:55:13,302] DEBUG Next range from chain tracker: 913885..913884 (com.ethvm.kafka.connect.sources.web3.tracker.CanonicalChainTracker)
[2019-06-07 22:55:13,302] DEBUG Range = 913885..913884, reOrgs = [] (com.ethvm.kafka.connect.sources.web3.sources.AbstractParityEntitySource)
[2019-06-07 22:55:13,302] DEBUG Reorg size = 0, records size = 0 (com.ethvm.kafka.connect.sources.web3.sources.AbstractParityEntitySource)
[2019-06-07 22:55:13,302] DEBUG Polled 0 records (com.ethvm.kafka.connect.sources.web3.ParitySourceTask)
[2019-06-07 22:55:14,302] DEBUG Polling (com.ethvm.kafka.connect.sources.web3.ParitySourceTask)
[2019-06-07 22:55:14,303] DEBUG Next range. Current tail = 913885, current head = 5750433 (com.ethvm.kafka.connect.sources.web3.tracker.CanonicalChainTracker)
[2019-06-07 22:55:14,303] DEBUG Next range from chain tracker: 913885..913884 (com.ethvm.kafka.connect.sources.web3.tracker.CanonicalChainTracker)
[2019-06-07 22:55:14,303] DEBUG Range = 913885..913884, reOrgs = [] (com.ethvm.kafka.connect.sources.web3.sources.AbstractParityEntitySource)
[2019-06-07 22:55:14,303] DEBUG Reorg size = 0, records size = 0 (com.ethvm.kafka.connect.sources.web3.sources.AbstractParityEntitySource)
[2019-06-07 22:55:14,303] DEBUG Polled 0 records (com.ethvm.kafka.connect.sources.web3.ParitySourceTask)
[2019-06-07 22:55:15,303] DEBUG Polling (com.ethvm.kafka.connect.sources.web3.ParitySourceTask)
[2019-06-07 22:55:15,303] DEBUG Next range. Current tail = 913885, current head = 5750433 (com.ethvm.kafka.connect.sources.web3.tracker.CanonicalChainTracker)
[2019-06-07 22:55:15,303] DEBUG Next range from chain tracker: 913885..913884 (com.ethvm.kafka.connect.sources.web3.tracker.CanonicalChainTracker)
[2019-06-07 22:55:15,303] DEBUG Range = 913885..913884, reOrgs = [] (com.ethvm.kafka.connect.sources.web3.sources.AbstractParityEntitySource)
[2019-06-07 22:55:15,303] DEBUG Reorg size = 0, records size = 0 (com.ethvm.kafka.connect.sources.web3.sources.AbstractParityEntitySource)
[2019-06-07 22:55:15,303] DEBUG Polled 0 records (com.ethvm.kafka.connect.sources.web3.ParitySourceTask)
[2019-06-07 22:55:16,304] DEBUG Polling (com.ethvm.kafka.connect.sources.web3.ParitySourceTask)
[2019-06-07 22:55:16,304] DEBUG Next range. Current tail = 913885, current head = 5750433 (com.ethvm.kafka.connect.sources.web3.tracker.CanonicalChainTracker)
[2019-06-07 22:55:16,304] DEBUG Next range from chain tracker: 913885..913884 (com.ethvm.kafka.connect.sources.web3.tracker.CanonicalChainTracker)
[2019-06-07 22:55:16,304] DEBUG Range = 913885..913884, reOrgs = [] (com.ethvm.kafka.connect.sources.web3.sources.AbstractParityEntitySource)
[2019-06-07 22:55:16,304] DEBUG Reorg size = 0, records size = 0 (com.ethvm.kafka.connect.sources.web3.sources.AbstractParityEntitySource)
[2019-06-07 22:55:16,304] DEBUG Polled 0 records (com.ethvm.kafka.connect.sources.web3.ParitySourceTask)
[2019-06-07 22:55:17,304] DEBUG Polling (com.ethvm.kafka.connect.sources.web3.ParitySourceTask)
[2019-06-07 22:55:17,304] DEBUG Next range. Current tail = 913885, current head = 5750433 (com.ethvm.kafka.connect.sources.web3.tracker.CanonicalChainTracker)
[2019-06-07 22:55:17,304] DEBUG Next range from chain tracker: 913885..913884 (com.ethvm.kafka.connect.sources.web3.tracker.CanonicalChainTracker)
[2019-06-07 22:55:17,304] DEBUG Range = 913885..913884, reOrgs = [] (com.ethvm.kafka.connect.sources.web3.sources.AbstractParityEntitySource)
[2019-06-07 22:55:17,305] DEBUG Reorg size = 0, records size = 0 (com.ethvm.kafka.connect.sources.web3.sources.AbstractParityEntitySource)
[2019-06-07 22:55:17,305] DEBUG Polled 0 records (com.ethvm.kafka.connect.sources.web3.ParitySourceTask)
[2019-06-07 22:55:18,305] DEBUG Polling (com.ethvm.kafka.connect.sources.web3.ParitySourceTask)
[2019-06-07 22:55:18,305] DEBUG Next range. Current tail = 913885, current head = 5750433 (com.ethvm.kafka.connect.sources.web3.tracker.CanonicalChainTracker)
[2019-06-07 22:55:18,305] DEBUG Next range from chain tracker: 913885..913884 (com.ethvm.kafka.connect.sources.web3.tracker.CanonicalChainTracker)
[2019-06-07 22:55:18,305] DEBUG Range = 913885..913884, reOrgs = [] (com.ethvm.kafka.connect.sources.web3.sources.AbstractParityEntitySource)
[2019-06-07 22:55:18,305] DEBUG Reorg size = 0, records size = 0 (com.ethvm.kafka.connect.sources.web3.sources.AbstractParityEntitySource)
[2019-06-07 22:55:18,305] DEBUG Polled 0 records (com.ethvm.kafka.connect.sources.web3.ParitySourceTask)
[2019-06-07 22:55:19,305] DEBUG Polling (com.ethvm.kafka.connect.sources.web3.ParitySourceTask)
[2019-06-07 22:55:19,306] DEBUG Next range. Current tail = 913885, current head = 5750433 (com.ethvm.kafka.connect.sources.web3.tracker.CanonicalChainTracker)
[2019-06-07 22:55:19,306] DEBUG Next range from chain tracker: 913885..913884 (com.ethvm.kafka.connect.sources.web3.tracker.CanonicalChainTracker)
[2019-06-07 22:55:19,306] DEBUG Range = 913885..913884, reOrgs = [] (com.ethvm.kafka.connect.sources.web3.sources.AbstractParityEntitySource)
[2019-06-07 22:55:19,306] DEBUG Reorg size = 0, records size = 0 (com.ethvm.kafka.connect.sources.web3.sources.AbstractParityEntitySource)
[2019-06-07 22:55:19,306] DEBUG Polled 0 records (com.ethvm.kafka.connect.sources.web3.ParitySourceTask)
[2019-06-07 22:55:20,306] DEBUG Polling (com.ethvm.kafka.connect.sources.web3.ParitySourceTask)
[2019-06-07 22:55:20,307] DEBUG Next range. Current tail = 913885, current head = 5750433 (com.ethvm.kafka.connect.sources.web3.tracker.CanonicalChainTracker)
[2019-06-07 22:55:20,307] DEBUG Next range from chain tracker: 913885..913884 (com.ethvm.kafka.connect.sources.web3.tracker.CanonicalChainTracker)
[2019-06-07 22:55:20,307] DEBUG Range = 913885..913884, reOrgs = [] (com.ethvm.kafka.connect.sources.web3.sources.AbstractParityEntitySource)
[2019-06-07 22:55:20,307] DEBUG Reorg size = 0, records size = 0 (com.ethvm.kafka.connect.sources.web3.sources.AbstractParityEntitySource)
[2019-06-07 22:55:20,307] DEBUG Polled 0 records (com.ethvm.kafka.connect.sources.web3.ParitySourceTask)
[2019-06-07 22:55:21,307] DEBUG Polling (com.ethvm.kafka.connect.sources.web3.ParitySourceTask)
[2019-06-07 22:55:21,307] DEBUG Next range. Current tail = 913885, current head = 5750433 (com.ethvm.kafka.connect.sources.web3.tracker.CanonicalChainTracker)
[2019-06-07 22:55:21,307] DEBUG Next range from chain tracker: 913885..913884 (com.ethvm.kafka.connect.sources.web3.tracker.CanonicalChainTracker)
[2019-06-07 22:55:21,307] DEBUG Range = 913885..913884, reOrgs = [] (com.ethvm.kafka.connect.sources.web3.sources.AbstractParityEntitySource)
[2019-06-07 22:55:21,307] DEBUG Reorg size = 0, records size = 0 (com.ethvm.kafka.connect.sources.web3.sources.AbstractParityEntitySource)
[2019-06-07 22:55:21,307] DEBUG Polled 0 records (com.ethvm.kafka.connect.sources.web3.ParitySourceTask)
[2019-06-07 22:55:22,308] DEBUG Polling (com.ethvm.kafka.connect.sources.web3.ParitySourceTask)
[2019-06-07 22:55:22,308] DEBUG Next range. Current tail = 913885, current head = 5750433 (com.ethvm.kafka.connect.sources.web3.tracker.CanonicalChainTracker)
[2019-06-07 22:55:22,309] DEBUG Next range from chain tracker: 913885..913884 (com.ethvm.kafka.connect.sources.web3.tracker.CanonicalChainTracker)
[2019-06-07 22:55:22,309] DEBUG Range = 913885..913884, reOrgs = [] (com.ethvm.kafka.connect.sources.web3.sources.AbstractParityEntitySource)
[2019-06-07 22:55:22,309] DEBUG Reorg size = 0, records size = 0 (com.ethvm.kafka.connect.sources.web3.sources.AbstractParityEntitySource)
[2019-06-07 22:55:22,309] DEBUG Polled 0 records (com.ethvm.kafka.connect.sources.web3.ParitySourceTask)
[2019-06-07 22:55:23,309] DEBUG Polling (com.ethvm.kafka.connect.sources.web3.ParitySourceTask)
[2019-06-07 22:55:23,309] DEBUG Next range. Current tail = 913885, current head = 5750433 (com.ethvm.kafka.connect.sources.web3.tracker.CanonicalChainTracker)
[2019-06-07 22:55:23,309] DEBUG Next range from chain tracker: 913885..913884 (com.ethvm.kafka.connect.sources.web3.tracker.CanonicalChainTracker)
[2019-06-07 22:55:23,309] DEBUG Range = 913885..913884, reOrgs = [] (com.ethvm.kafka.connect.sources.web3.sources.AbstractParityEntitySource)
[2019-06-07 22:55:23,309] DEBUG Reorg size = 0, records size = 0 (com.ethvm.kafka.connect.sources.web3.sources.AbstractParityEntitySource)
[2019-06-07 22:55:23,309] DEBUG Polled 0 records (com.ethvm.kafka.connect.sources.web3.ParitySourceTask)
[2019-06-07 22:55:24,309] DEBUG Polling (com.ethvm.kafka.connect.sources.web3.ParitySourceTask)
[2019-06-07 22:55:24,310] DEBUG Next range. Current tail = 913885, current head = 5750433 (com.ethvm.kafka.connect.sources.web3.tracker.CanonicalChainTracker)
[2019-06-07 22:55:24,310] DEBUG Next range from chain tracker: 913885..913884 (com.ethvm.kafka.connect.sources.web3.tracker.CanonicalChainTracker)
[2019-06-07 22:55:24,310] DEBUG Range = 913885..913884, reOrgs = [] (com.ethvm.kafka.connect.sources.web3.sources.AbstractParityEntitySource)
[2019-06-07 22:55:24,310] DEBUG Reorg size = 0, records size = 0 (com.ethvm.kafka.connect.sources.web3.sources.AbstractParityEntitySource)
[2019-06-07 22:55:24,310] DEBUG Polled 0 records (com.ethvm.kafka.connect.sources.web3.ParitySourceTask)
[2019-06-07 22:55:24,538] WARN WorkerSinkTask{id=postgres-trace-sink-0} Commit of offsets timed out (org.apache.kafka.connect.runtime.WorkerSinkTask)
[2019-06-07 22:55:25,310] DEBUG Polling (com.ethvm.kafka.connect.sources.web3.ParitySourceTask)
[2019-06-07 22:55:25,310] DEBUG Next range. Current tail = 913885, current head = 5750433 (com.ethvm.kafka.connect.sources.web3.tracker.CanonicalChainTracker)
[2019-06-07 22:55:25,310] DEBUG Next range from chain tracker: 913885..913884 (com.ethvm.kafka.connect.sources.web3.tracker.CanonicalChainTracker)
[A second execution, different from the previous one]
[2019-06-08 13:01:48,342] DEBUG Total blocks fetched = 599402, fetch count = 1. Elapsed = 453 ms, target fetch = 1000 ms, % of target fetch = 0.453, trace count = 689, avg trace count = 1 (com.ethvm.kafka.connect.sources.web3.sources.AbstractParityEntitySource)
[2019-06-08 13:01:48,342] DEBUG Reorg size = 0, records size = 6 (com.ethvm.kafka.connect.sources.web3.sources.AbstractParityEntitySource)
[2019-06-08 13:01:48,342] DEBUG Polled 6 records (com.ethvm.kafka.connect.sources.web3.ParitySourceTask)
[2019-06-08 13:01:48,352] DEBUG Polling (com.ethvm.kafka.connect.sources.web3.ParitySourceTask)
[2019-06-08 13:01:48,352] DEBUG Next range. Current tail = 599402, current head = 5753124 (com.ethvm.kafka.connect.sources.web3.tracker.CanonicalChainTracker)
[2019-06-08 13:01:48,352] DEBUG Next range from chain tracker: 599402..599403 (com.ethvm.kafka.connect.sources.web3.tracker.CanonicalChainTracker)
[2019-06-08 13:01:48,352] DEBUG Range = 599402..599403, reOrgs = [] (com.ethvm.kafka.connect.sources.web3.sources.AbstractParityEntitySource)
[2019-06-08 13:01:48,352] DEBUG Fetching range: 599402..599402. Next = 599402..599402 (com.ethvm.kafka.connect.sources.web3.sources.ParityFullBlockSource)
[2019-06-08 13:01:48,352] DEBUG Fetching range: 599403..599403. Next = 599403..599403 (com.ethvm.kafka.connect.sources.web3.sources.ParityFullBlockSource)
[2019-06-08 13:01:48,421] DEBUG Parity request: 599402, 599402, 1000 (com.ethvm.kafka.connect.sources.web3.sources.ParityFullBlockSource)
[2019-06-08 13:01:48,563] DEBUG Parity request: 599403, 599403, 1000 (com.ethvm.kafka.connect.sources.web3.sources.ParityFullBlockSource)
[2019-06-08 13:01:48,593] DEBUG Total blocks fetched = 599404, fetch count = 2. Elapsed = 241 ms, target fetch = 1000 ms, % of target fetch = 0.241, trace count = 340, avg trace count = 1 (com.ethvm.kafka.connect.sources.web3.sources.AbstractParityEntitySource)
[2019-06-08 13:01:48,593] DEBUG Reorg size = 0, records size = 11 (com.ethvm.kafka.connect.sources.web3.sources.AbstractParityEntitySource)
[2019-06-08 13:01:48,593] DEBUG Polled 11 records (com.ethvm.kafka.connect.sources.web3.ParitySourceTask)
[2019-06-08 13:01:48,598] DEBUG Polling (com.ethvm.kafka.connect.sources.web3.ParitySourceTask)
[2019-06-08 13:01:48,598] DEBUG Next range. Current tail = 599404, current head = 5753124 (com.ethvm.kafka.connect.sources.web3.tracker.CanonicalChainTracker)
[2019-06-08 13:01:48,598] DEBUG Next range from chain tracker: 599404..599407 (com.ethvm.kafka.connect.sources.web3.tracker.CanonicalChainTracker)
[2019-06-08 13:01:48,598] DEBUG Range = 599404..599407, reOrgs = [] (com.ethvm.kafka.connect.sources.web3.sources.AbstractParityEntitySource)
[2019-06-08 13:01:48,598] DEBUG Fetching range: 599404..599404. Next = 599404..599404 (com.ethvm.kafka.connect.sources.web3.sources.ParityFullBlockSource)
[2019-06-08 13:01:48,598] DEBUG Fetching range: 599405..599405. Next = 599405..599405 (com.ethvm.kafka.connect.sources.web3.sources.ParityFullBlockSource)
[2019-06-08 13:01:48,598] DEBUG Fetching range: 599406..599406. Next = 599406..599406 (com.ethvm.kafka.connect.sources.web3.sources.ParityFullBlockSource)
[2019-06-08 13:01:48,598] DEBUG Fetching range: 599407..599407. Next = 599407..599407 (com.ethvm.kafka.connect.sources.web3.sources.ParityFullBlockSource)
[2019-06-08 13:01:48,851] DEBUG Parity request: 599404, 599404, 1000 (com.ethvm.kafka.connect.sources.web3.sources.ParityFullBlockSource)
[2019-06-08 13:01:48,910] DEBUG Parity request: 599405, 599405, 1000 (com.ethvm.kafka.connect.sources.web3.sources.ParityFullBlockSource)
[2019-06-08 13:01:48,940] DEBUG Parity request: 599406, 599406, 1000 (com.ethvm.kafka.connect.sources.web3.sources.ParityFullBlockSource)
[2019-06-08 13:01:49,958] DEBUG Parity request: 599407, 599407, 1000 (com.ethvm.kafka.connect.sources.web3.sources.ParityFullBlockSource)
[2019-06-08 13:01:50,169] DEBUG Total blocks fetched = 599408, fetch count = 4. Elapsed = 1571 ms, target fetch = 1000 ms, % of target fetch = 1.571, trace count = 4587, avg trace count = 4 (com.ethvm.kafka.connect.sources.web3.sources.AbstractParityEntitySource)
[2019-06-08 13:01:50,169] DEBUG Reorg size = 0, records size = 21 (com.ethvm.kafka.connect.sources.web3.sources.AbstractParityEntitySource)
[2019-06-08 13:01:50,169] DEBUG Polled 21 records (com.ethvm.kafka.connect.sources.web3.ParitySourceTask)
[2019-06-08 13:01:50,320] DEBUG Polling (com.ethvm.kafka.connect.sources.web3.ParitySourceTask)
[2019-06-08 13:01:50,322] DEBUG Next range. Current tail = 599408, current head = 5753124 (com.ethvm.kafka.connect.sources.web3.tracker.CanonicalChainTracker)
[2019-06-08 13:01:50,322] DEBUG Next range from chain tracker: 599408..599409 (com.ethvm.kafka.connect.sources.web3.tracker.CanonicalChainTracker)
[2019-06-08 13:01:50,322] DEBUG Range = 599408..599409, reOrgs = [] (com.ethvm.kafka.connect.sources.web3.sources.AbstractParityEntitySource)
[2019-06-08 13:01:50,323] DEBUG Fetching range: 599408..599408. Next = 599408..599408 (com.ethvm.kafka.connect.sources.web3.sources.ParityFullBlockSource)
[2019-06-08 13:01:50,323] DEBUG Fetching range: 599409..599409. Next = 599409..599409 (com.ethvm.kafka.connect.sources.web3.sources.ParityFullBlockSource)
[2019-06-08 13:01:51,095] DEBUG Parity request: 599409, 599409, 1000 (com.ethvm.kafka.connect.sources.web3.sources.ParityFullBlockSource)
[2019-06-08 13:01:51,361] DEBUG Parity request: 599408, 599408, 1000 (com.ethvm.kafka.connect.sources.web3.sources.ParityFullBlockSource)
[2019-06-08 13:01:51,534] DEBUG Total blocks fetched = 599410, fetch count = 2. Elapsed = 1212 ms, target fetch = 1000 ms, % of target fetch = 1.212, trace count = 8935, avg trace count = 16 (com.ethvm.kafka.connect.sources.web3.sources.AbstractParityEntitySource)
[2019-06-08 13:01:51,534] DEBUG Reorg size = 0, records size = 11 (com.ethvm.kafka.connect.sources.web3.sources.AbstractParityEntitySource)
[2019-06-08 13:01:51,534] DEBUG Polled 11 records (com.ethvm.kafka.connect.sources.web3.ParitySourceTask)
[2019-06-08 13:01:51,666] DEBUG Polling (com.ethvm.kafka.connect.sources.web3.ParitySourceTask)
[2019-06-08 13:01:51,666] DEBUG Next range. Current tail = 599410, current head = 5753124 (com.ethvm.kafka.connect.sources.web3.tracker.CanonicalChainTracker)
[2019-06-08 13:01:51,666] DEBUG Next range from chain tracker: 599410..599411 (com.ethvm.kafka.connect.sources.web3.tracker.CanonicalChainTracker)
[2019-06-08 13:01:51,666] DEBUG Range = 599410..599411, reOrgs = [] (com.ethvm.kafka.connect.sources.web3.sources.AbstractParityEntitySource)
[2019-06-08 13:01:51,666] DEBUG Fetching range: 599411..599411. Next = 599411..599411 (com.ethvm.kafka.connect.sources.web3.sources.ParityFullBlockSource)
[2019-06-08 13:01:51,666] DEBUG Fetching range: 599410..599410. Next = 599410..599410 (com.ethvm.kafka.connect.sources.web3.sources.ParityFullBlockSource)
[2019-06-08 13:01:52,999] DEBUG Parity request: 599411, 599411, 1000 (com.ethvm.kafka.connect.sources.web3.sources.ParityFullBlockSource)
[2019-06-08 13:01:53,874] DEBUG Parity request: 599410, 599410, 1000 (com.ethvm.kafka.connect.sources.web3.sources.ParityFullBlockSource)
[2019-06-08 13:01:54,384] DEBUG Total blocks fetched = 599412, fetch count = 2. Elapsed = 2718 ms, target fetch = 1000 ms, % of target fetch = 2.718, trace count = 20348, avg trace count = 21 (com.ethvm.kafka.connect.sources.web3.sources.AbstractParityEntitySource)
[2019-06-08 13:01:54,384] DEBUG Reorg size = 0, records size = 11 (com.ethvm.kafka.connect.sources.web3.sources.AbstractParityEntitySource)
[2019-06-08 13:01:54,384] DEBUG Polled 11 records (com.ethvm.kafka.connect.sources.web3.ParitySourceTask)
[2019-06-08 13:01:54,640] DEBUG Polling (com.ethvm.kafka.connect.sources.web3.ParitySourceTask)
[2019-06-08 13:01:54,640] DEBUG Next range. Current tail = 599412, current head = 5753124 (com.ethvm.kafka.connect.sources.web3.tracker.CanonicalChainTracker)
[2019-06-08 13:01:54,640] DEBUG Next range from chain tracker: 599412..599412 (com.ethvm.kafka.connect.sources.web3.tracker.CanonicalChainTracker)
[2019-06-08 13:01:54,640] DEBUG Range = 599412..599412, reOrgs = [] (com.ethvm.kafka.connect.sources.web3.sources.AbstractParityEntitySource)
[2019-06-08 13:01:54,640] DEBUG Fetching range: 599412..599412. Next = 599412..599412 (com.ethvm.kafka.connect.sources.web3.sources.ParityFullBlockSource)
[2019-06-08 13:01:55,872] DEBUG Parity request: 599412, 599412, 1000 (com.ethvm.kafka.connect.sources.web3.sources.ParityFullBlockSource)
[2019-06-08 13:01:56,157] DEBUG Total blocks fetched = 599413, fetch count = 1. Elapsed = 1517 ms, target fetch = 1000 ms, % of target fetch = 1.517, trace count = 8474, avg trace count = 11 (com.ethvm.kafka.connect.sources.web3.sources.AbstractParityEntitySource)
[2019-06-08 13:01:56,157] DEBUG Reorg size = 0, records size = 6 (com.ethvm.kafka.connect.sources.web3.sources.AbstractParityEntitySource)
[2019-06-08 13:01:56,158] DEBUG Polled 6 records (com.ethvm.kafka.connect.sources.web3.ParitySourceTask)
[2019-06-08 13:01:56,233] DEBUG Polling (com.ethvm.kafka.connect.sources.web3.ParitySourceTask)
[2019-06-08 13:01:56,233] DEBUG Next range. Current tail = 599413, current head = 5753124 (com.ethvm.kafka.connect.sources.web3.tracker.CanonicalChainTracker)
[2019-06-08 13:01:56,233] DEBUG Next range from chain tracker: 599413..599412 (com.ethvm.kafka.connect.sources.web3.tracker.CanonicalChainTracker)
[2019-06-08 13:01:56,233] DEBUG Range = 599413..599412, reOrgs = [] (com.ethvm.kafka.connect.sources.web3.sources.AbstractParityEntitySource)
[2019-06-08 13:01:56,233] DEBUG Reorg size = 0, records size = 0 (com.ethvm.kafka.connect.sources.web3.sources.AbstractParityEntitySource)
[2019-06-08 13:01:56,233] DEBUG Polled 0 records (com.ethvm.kafka.connect.sources.web3.ParitySourceTask)
[2019-06-08 13:01:57,233] DEBUG Polling (com.ethvm.kafka.connect.sources.web3.ParitySourceTask)
[2019-06-08 13:01:57,233] DEBUG Next range. Current tail = 599413, current head = 5753124 (com.ethvm.kafka.connect.sources.web3.tracker.CanonicalChainTracker)
[2019-06-08 13:01:57,233] DEBUG Next range from chain tracker: 599413..599412 (com.ethvm.kafka.connect.sources.web3.tracker.CanonicalChainTracker)
[2019-06-08 13:01:57,233] DEBUG Range = 599413..599412, reOrgs = [] (com.ethvm.kafka.connect.sources.web3.sources.AbstractParityEntitySource)
[2019-06-08 13:01:57,233] DEBUG Reorg size = 0, records size = 0 (com.ethvm.kafka.connect.sources.web3.sources.AbstractParityEntitySource)
[2019-06-08 13:01:57,234] DEBUG Polled 0 records (com.ethvm.kafka.connect.sources.web3.ParitySourceTask)
[2019-06-08 13:01:58,234] DEBUG Polling (com.ethvm.kafka.connect.sources.web3.ParitySourceTask)
[2019-06-08 13:01:58,234] DEBUG Next range. Current tail = 599413, current head = 5753124 (com.ethvm.kafka.connect.sources.web3.tracker.CanonicalChainTracker)
[2019-06-08 13:01:58,234] DEBUG Next range from chain tracker: 599413..599412 (com.ethvm.kafka.connect.sources.web3.tracker.CanonicalChainTracker)
[2019-06-08 13:01:58,234] DEBUG Range = 599413..599412, reOrgs = [] (com.ethvm.kafka.connect.sources.web3.sources.AbstractParityEntitySource)
[2019-06-08 13:01:58,234] DEBUG Reorg size = 0, records size = 0 (com.ethvm.kafka.connect.sources.web3.sources.AbstractParityEntitySource)
[2019-06-08 13:01:58,234] DEBUG Polled 0 records (com.ethvm.kafka.connect.sources.web3.ParitySourceTask)
[2019-06-08 13:01:59,234] DEBUG Polling (com.ethvm.kafka.connect.sources.web3.ParitySourceTask)
[2019-06-08 13:01:59,234] DEBUG Next range. Current tail = 599413, current head = 5753124 (com.ethvm.kafka.connect.sources.web3.tracker.CanonicalChainTracker)
[2019-06-08 13:01:59,234] DEBUG Next range from chain tracker: 599413..599412 (com.ethvm.kafka.connect.sources.web3.tracker.CanonicalChainTracker)
[2019-06-08 13:01:59,234] DEBUG Range = 599413..599412, reOrgs = [] (com.ethvm.kafka.connect.sources.web3.sources.AbstractParityEntitySource)
[2019-06-08 13:01:59,234] DEBUG Reorg size = 0, records size = 0 (com.ethvm.kafka.connect.sources.web3.sources.AbstractParityEntitySource)
[2019-06-08 13:01:59,235] DEBUG Polled 0 records (com.ethvm.kafka.connect.sources.web3.ParitySourceTask)
[2019-06-08 13:02:00,235] DEBUG Polling (com.ethvm.kafka.connect.sources.web3.ParitySourceTask)
[2019-06-08 13:02:00,235] DEBUG Next range. Current tail = 599413, current head = 5753124 (com.ethvm.kafka.connect.sources.web3.tracker.CanonicalChainTracker)
[2019-06-08 13:02:00,235] DEBUG Next range from chain tracker: 599413..599412 (com.ethvm.kafka.connect.sources.web3.tracker.CanonicalChainTracker)
[2019-06-08 13:02:00,235] DEBUG Range = 599413..599412, reOrgs = [] (com.ethvm.kafka.connect.sources.web3.sources.AbstractParityEntitySource)
[2019-06-08 13:02:00,235] DEBUG Reorg size = 0, records size = 0 (com.ethvm.kafka.connect.sources.web3.sources.AbstractParityEntitySource)
[2019-06-08 13:02:00,235] DEBUG Polled 0 records (com.ethvm.kafka.connect.sources.web3.ParitySourceTask)
[2019-06-08 13:02:01,235] DEBUG Polling (com.ethvm.kafka.connect.sources.web3.ParitySourceTask)
[2019-06-08 13:02:01,235] DEBUG Next range. Current tail = 599413, current head = 5753124 (com.ethvm.kafka.connect.sources.web3.tracker.CanonicalChainTracker)
[2019-06-08 13:02:01,235] DEBUG Next range from chain tracker: 599413..599412 (com.ethvm.kafka.connect.sources.web3.tracker.CanonicalChainTracker)
[2019-06-08 13:02:01,236] DEBUG Range = 599413..599412, reOrgs = [] (com.ethvm.kafka.connect.sources.web3.sources.AbstractParityEntitySource)
[2019-06-08 13:02:01,236] DEBUG Reorg size = 0, records size = 0 (com.ethvm.kafka.connect.sources.web3.sources.AbstractParityEntitySource)
[2019-06-08 13:02:01,236] DEBUG Polled 0 records (com.ethvm.kafka.connect.sources.web3.ParitySourceTask)
[2019-06-08 13:02:02,236] DEBUG Polling (com.ethvm.kafka.connect.sources.web3.ParitySourceTask)
[2019-06-08 13:02:02,236] DEBUG Next range. Current tail = 599413, current head = 5753124 (com.ethvm.kafka.connect.sources.web3.tracker.CanonicalChainTracker)
[2019-06-08 13:02:02,236] DEBUG Next range from chain tracker: 599413..599412 (com.ethvm.kafka.connect.sources.web3.tracker.CanonicalChainTracker)
[2019-06-08 13:02:02,236] DEBUG Range = 599413..599412, reOrgs = [] (com.ethvm.kafka.connect.sources.web3.sources.AbstractParityEntitySource)
[2019-06-08 13:02:02,236] DEBUG Reorg size = 0, records size = 0 (com.ethvm.kafka.connect.sources.web3.sources.AbstractParityEntitySource)
[2019-06-08 13:02:02,236] DEBUG Polled 0 records (com.ethvm.kafka.connect.sources.web3.ParitySourceTask)
[2019-06-08 13:02:03,236] DEBUG Polling (com.ethvm.kafka.connect.sources.web3.ParitySourceTask)
[2019-06-08 13:02:03,236] DEBUG Next range. Current tail = 599413, current head = 5753124 (com.ethvm.kafka.connect.sources.web3.tracker.CanonicalChainTracker)
[2019-06-08 13:02:03,236] DEBUG Next range from chain tracker: 599413..599412 (com.ethvm.kafka.connect.sources.web3.tracker.CanonicalChainTracker)
[2019-06-08 13:02:03,236] DEBUG Range = 599413..599412, reOrgs = [] (com.ethvm.kafka.connect.sources.web3.sources.AbstractParityEntitySource)
[2019-06-08 13:02:03,236] DEBUG Reorg size = 0, records size = 0 (com.ethvm.kafka.connect.sources.web3.sources.AbstractParityEntitySource)
[2019-06-08 13:02:03,236] DEBUG Polled 0 records (com.ethvm.kafka.connect.sources.web3.ParitySourceTask)
[2019-06-08 13:02:04,237] DEBUG Polling (com.ethvm.kafka.connect.sources.web3.ParitySourceTask)
[2019-06-08 13:02:04,237] DEBUG Next range. Current tail = 599413, current head = 5753124 (com.ethvm.kafka.connect.sources.web3.tracker.CanonicalChainTracker)
[2019-06-08 13:02:04,237] DEBUG Next range from chain tracker: 599413..599412 (com.ethvm.kafka.connect.sources.web3.tracker.CanonicalChainTracker)
[2019-06-08 13:02:04,237] DEBUG Range = 599413..599412, reOrgs = [] (com.ethvm.kafka.connect.sources.web3.sources.AbstractParityEntitySource)
[2019-06-08 13:02:04,237] DEBUG Reorg size = 0, records size = 0 (com.ethvm.kafka.connect.sources.web3.sources.AbstractParityEntitySource)
[2019-06-08 13:02:04,237] DEBUG Polled 0 records (com.ethvm.kafka.connect.sources.web3.ParitySourceTask)
[2019-06-08 13:02:05,237] DEBUG Polling (com.ethvm.kafka.connect.sources.web3.ParitySourceTask)
[2019-06-08 13:02:05,237] DEBUG Next range. Current tail = 599413, current head = 5753124 (com.ethvm.kafka.connect.sources.web3.tracker.CanonicalChainTracker)
[2019-06-08 13:02:05,237] DEBUG Next range from chain tracker: 599413..599412 (com.ethvm.kafka.connect.sources.web3.tracker.CanonicalChainTracker)
[2019-06-08 13:02:05,237] DEBUG Range = 599413..599412, reOrgs = [] (com.ethvm.kafka.connect.sources.web3.sources.AbstractParityEntitySource)
[2019-06-08 13:02:05,237] DEBUG Reorg size = 0, records size = 0 (com.ethvm.kafka.connect.sources.web3.sources.AbstractParityEntitySource)
[2019-06-08 13:02:05,238] DEBUG Polled 0 records (com.ethvm.kafka.connect.sources.web3.ParitySourceTask)
[2019-06-08 13:02:05,851] DEBUG Trying to reset tail to 5753125 (com.ethvm.kafka.connect.sources.web3.tracker.CanonicalChainTracker)
[2019-06-08 13:02:05,852] DEBUG Current tail = 599413 (com.ethvm.kafka.connect.sources.web3.tracker.CanonicalChainTracker)
[2019-06-08 13:02:05,852] DEBUG After reset attempt tail = 599413 (com.ethvm.kafka.connect.sources.web3.tracker.CanonicalChainTracker)
[2019-06-08 13:02:06,238] DEBUG Polling (com.ethvm.kafka.connect.sources.web3.ParitySourceTask)
[2019-06-08 13:02:06,238] DEBUG Next range. Current tail = 599413, current head = 5753125 (com.ethvm.kafka.connect.sources.web3.tracker.CanonicalChainTracker)
[2019-06-08 13:02:06,238] DEBUG Next range from chain tracker: 599413..599412 (com.ethvm.kafka.connect.sources.web3.tracker.CanonicalChainTracker)
[2019-06-08 13:02:06,238] DEBUG Range = 599413..599412, reOrgs = [] (com.ethvm.kafka.connect.sources.web3.sources.AbstractParityEntitySource)
[2019-06-08 13:02:06,238] DEBUG Reorg size = 0, records size = 0 (com.ethvm.kafka.connect.sources.web3.sources.AbstractParityEntitySource)
[2019-06-08 13:02:06,238] DEBUG Polled 0 records (com.ethvm.kafka.connect.sources.web3.ParitySourceTask)
[2019-06-08 13:02:07,241] DEBUG Polling (com.ethvm.kafka.connect.sources.web3.ParitySourceTask)
[2019-06-08 13:02:07,242] DEBUG Next range. Current tail = 599413, current head = 5753125 (com.ethvm.kafka.connect.sources.web3.tracker.CanonicalChainTracker)
[2019-06-08 13:02:07,242] DEBUG Next range from chain tracker: 599413..599412 (com.ethvm.kafka.connect.sources.web3.tracker.CanonicalChainTracker)
[2019-06-08 13:02:07,242] DEBUG Range = 599413..599412, reOrgs = [] (com.ethvm.kafka.connect.sources.web3.sources.AbstractParityEntitySource)
[2019-06-08 13:02:07,242] DEBUG Reorg size = 0, records size = 0 (com.ethvm.kafka.connect.sources.web3.sources.AbstractParityEntitySource)
[2019-06-08 13:02:07,242] DEBUG Polled 0 records (com.ethvm.kafka.connect.sources.web3.ParitySourceTask)
[2019-06-08 13:02:08,242] DEBUG Polling (com.ethvm.kafka.connect.sources.web3.ParitySourceTask)
[2019-06-08 13:02:08,242] DEBUG Next range. Current tail = 599413, current head = 5753125 (com.ethvm.kafka.connect.sources.web3.tracker.CanonicalChainTracker)
[2019-06-08 13:02:08,242] DEBUG Next range from chain tracker: 599413..599412 (com.ethvm.kafka.connect.sources.web3.tracker.CanonicalChainTracker)
[2019-06-08 13:02:08,242] DEBUG Range = 599413..599412, reOrgs = [] (com.ethvm.kafka.connect.sources.web3.sources.AbstractParityEntitySource)
[2019-06-08 13:02:08,242] DEBUG Reorg size = 0, records size = 0 (com.ethvm.kafka.connect.sources.web3.sources.AbstractParityEntitySource)
[2019-06-08 13:02:08,242] DEBUG Polled 0 records (com.ethvm.kafka.connect.sources.web3.ParitySourceTask)
[2019-06-08 13:02:09,242] DEBUG Polling (com.ethvm.kafka.connect.sources.web3.ParitySourceTask)
[2019-06-08 13:02:09,242] DEBUG Next range. Current tail = 599413, current head = 5753125 (com.ethvm.kafka.connect.sources.web3.tracker.CanonicalChainTracker)
[2019-06-08 13:02:09,242] DEBUG Next range from chain tracker: 599413..599412 (com.ethvm.kafka.connect.sources.web3.tracker.CanonicalChainTracker)
[2019-06-08 13:02:09,242] DEBUG Range = 599413..599412, reOrgs = [] (com.ethvm.kafka.connect.sources.web3.sources.AbstractParityEntitySource)
[2019-06-08 13:02:09,242] DEBUG Reorg size = 0, records size = 0 (com.ethvm.kafka.connect.sources.web3.sources.AbstractParityEntitySource)
[2019-06-08 13:02:09,242] DEBUG Polled 0 records (com.ethvm.kafka.connect.sources.web3.ParitySourceTask)
[2019-06-08 13:02:10,243] DEBUG Polling (com.ethvm.kafka.connect.sources.web3.ParitySourceTask)
[2019-06-08 13:02:10,243] DEBUG Next range. Current tail = 599413, current head = 5753125 (com.ethvm.kafka.connect.sources.web3.tracker.CanonicalChainTracker)
[2019-06-08 13:02:10,243] DEBUG Next range from chain tracker: 599413..599412 (com.ethvm.kafka.connect.sources.web3.tracker.CanonicalChainTracker)
[2019-06-08 13:02:10,243] DEBUG Range = 599413..599412, reOrgs = [] (com.ethvm.kafka.connect.sources.web3.sources.AbstractParityEntitySource)
[2019-06-08 13:02:10,243] DEBUG Reorg size = 0, records size = 0 (com.ethvm.kafka.connect.sources.web3.sources.AbstractParityEntitySource)
[2019-06-08 13:02:10,243] DEBUG Polled 0 records (com.ethvm.kafka.connect.sources.web3.ParitySourceTask)
[2019-06-08 13:02:11,243] DEBUG Polling (com.ethvm.kafka.connect.sources.web3.ParitySourceTask)
[2019-06-08 13:02:11,243] DEBUG Next range. Current tail = 599413, current head = 5753125 (com.ethvm.kafka.connect.sources.web3.tracker.CanonicalChainTracker)
[2019-06-08 13:02:11,243] DEBUG Next range from chain tracker: 599413..599412 (com.ethvm.kafka.connect.sources.web3.tracker.CanonicalChainTracker)
[2019-06-08 13:02:11,243] DEBUG Range = 599413..599412, reOrgs = [] (com.ethvm.kafka.connect.sources.web3.sources.AbstractParityEntitySource)
[2019-06-08 13:02:11,243] DEBUG Reorg size = 0, records size = 0 (com.ethvm.kafka.connect.sources.web3.sources.AbstractParityEntitySource)
[2019-06-08 13:02:11,243] DEBUG Polled 0 records (com.ethvm.kafka.connect.sources.web3.ParitySourceTask)
[2019-06-08 13:02:12,243] DEBUG Polling (com.ethvm.kafka.connect.sources.web3.ParitySourceTask)
[2019-06-08 13:02:12,243] DEBUG Next range. Current tail = 599413, current head = 5753125 (com.ethvm.kafka.connect.sources.web3.tracker.CanonicalChainTracker)
[2019-06-08 13:02:12,244] DEBUG Next range from chain tracker: 599413..599412 (com.ethvm.kafka.connect.sources.web3.tracker.CanonicalChainTracker)
[2019-06-08 13:02:12,244] DEBUG Range = 599413..599412, reOrgs = [] (com.ethvm.kafka.connect.sources.web3.sources.AbstractParityEntitySource)
[2019-06-08 13:02:12,244] DEBUG Reorg size = 0, records size = 0 (com.ethvm.kafka.connect.sources.web3.sources.AbstractParityEntitySource)
[2019-06-08 13:02:12,244] DEBUG Polled 0 records (com.ethvm.kafka.connect.sources.web3.ParitySourceTask)
[2019-06-08 13:02:13,244] DEBUG Polling (com.ethvm.kafka.connect.sources.web3.ParitySourceTask)
[2019-06-08 13:02:13,244] DEBUG Next range. Current tail = 599413, current head = 5753125 (com.ethvm.kafka.connect.sources.web3.tracker.CanonicalChainTracker)
[2019-06-08 13:02:13,244] DEBUG Next range from chain tracker: 599413..599412 (com.ethvm.kafka.connect.sources.web3.tracker.CanonicalChainTracker)
[2019-06-08 13:02:13,244] DEBUG Range = 599413..599412, reOrgs = [] (com.ethvm.kafka.connect.sources.web3.sources.AbstractParityEntitySource)
[2019-06-08 13:02:13,244] DEBUG Reorg size = 0, records size = 0 (com.ethvm.kafka.connect.sources.web3.sources.AbstractParityEntitySource)
[2019-06-08 13:02:13,244] DEBUG Polled 0 records (com.ethvm.kafka.connect.sources.web3.ParitySourceTask)
```
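The log above keeps reporting the empty range `599413..599412` (end = tail - 1) with "Polled 0 records", and the attempted tail reset to the new head leaves the tail unchanged ("Trying to reset tail to 5753125 ... After reset attempt tail = 599413"). A minimal sketch of that symptom is below; it is a hypothetical Python model, not the real ethvm Kotlin tracker, and the class name, method names, and the "reset only moves the tail backwards" rule are all assumptions inferred from the log output:

```python
# Sketch of the stuck-tail symptom seen in the log above. All names and the
# reset rule are assumptions inferred from the DEBUG lines, not ethvm code.

class StuckTailTracker:
    def __init__(self, tail: int, head: int) -> None:
        self.tail = tail  # next block the source would fetch
        self.head = head  # latest known chain head

    def try_reset_tail(self, new_tail: int) -> int:
        # Consistent with the log: a reset that only ever moves the tail
        # backwards (e.g. to re-process blocks after a reorg), so resetting
        # to a head far *ahead* of the tail is a no-op.
        if new_tail < self.tail:
            self.tail = new_tail
        return self.tail

    def next_range(self, end: int) -> range:
        # "Next range from chain tracker: 599413..599412" is tail..tail-1,
        # i.e. an empty range whenever the supplied end is behind the tail.
        return range(self.tail, max(end, self.tail - 1) + 1)


tracker = StuckTailTracker(tail=599_413, head=5_753_124)
assert len(tracker.next_range(end=599_412)) == 0          # "Polled 0 records"
assert tracker.try_reset_tail(5_753_125) == 599_413       # reset is a no-op
```

Under these assumptions the source can never recover once the tail falls behind and only forward resets are attempted, which matches the log never advancing past block 599413.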
|
process
|
| 1
|
8,847
| 11,951,322,916
|
IssuesEvent
|
2020-04-03 16:40:25
|
MicrosoftDocs/vsts-docs
|
https://api.github.com/repos/MicrosoftDocs/vsts-docs
|
closed
|
Consuming ACR container resources
|
Pri1 devops-cicd-process/tech devops/prod
|
Hi,
I'm trying to use the first class ACR container resources in my build pipeline as a trigger to run whenever I push a new version. I'm wondering if I can use that very same container as the container for a container job.
Thank you,
Michael
---
#### Document Details
⚠ *Do not edit this section. It is required for docs.microsoft.com ➟ GitHub issue linking.*
* ID: ee4ec9d0-e0d5-4fb4-7c3e-b84abfa290c2
* Version Independent ID: 3e2b80d9-30e5-0c48-49f0-4fcdfedf5eee
* Content: [Resources - Azure Pipelines](https://docs.microsoft.com/en-us/azure/devops/pipelines/process/resources?view=azure-devops&tabs=schema#resources-containers)
* Content Source: [docs/pipelines/process/resources.md](https://github.com/MicrosoftDocs/vsts-docs/blob/master/docs/pipelines/process/resources.md)
* Product: **devops**
* Technology: **devops-cicd-process**
* GitHub Login: @juliakm
* Microsoft Alias: **jukullam**
|
1.0
|
Consuming ACR container resources - Hi,
I'm trying to use the first class ACR container resources in my build pipeline as a trigger to run whenever I push a new version. I'm wondering if I can use that very same container as the container for a container job.
Thank you,
Michael
---
#### Document Details
⚠ *Do not edit this section. It is required for docs.microsoft.com ➟ GitHub issue linking.*
* ID: ee4ec9d0-e0d5-4fb4-7c3e-b84abfa290c2
* Version Independent ID: 3e2b80d9-30e5-0c48-49f0-4fcdfedf5eee
* Content: [Resources - Azure Pipelines](https://docs.microsoft.com/en-us/azure/devops/pipelines/process/resources?view=azure-devops&tabs=schema#resources-containers)
* Content Source: [docs/pipelines/process/resources.md](https://github.com/MicrosoftDocs/vsts-docs/blob/master/docs/pipelines/process/resources.md)
* Product: **devops**
* Technology: **devops-cicd-process**
* GitHub Login: @juliakm
* Microsoft Alias: **jukullam**
|
process
|
consuming acr container resources hi i m trying to use the first class acr container resources in my build pipeline as a trigger to run whenever i push a new version i m wondering if i can use that very same container as the container for a container job thank you michael document details ⚠ do not edit this section it is required for docs microsoft com ➟ github issue linking id version independent id content content source product devops technology devops cicd process github login juliakm microsoft alias jukullam
| 1
|
54,059
| 13,247,209,657
|
IssuesEvent
|
2020-08-19 16:52:36
|
zephyrproject-rtos/zephyr
|
https://api.github.com/repos/zephyrproject-rtos/zephyr
|
closed
|
"west zephyr-export" dumps stack if cmake is not installed
|
area: Build System area: West bug priority: low
|
**Describe the bug**
The `west zephyr-export` command should fail gracefully if cmake is not installed, but it does not
**To Reproduce**
Steps to reproduce the behavior:
1. uninstall cmake
2. make sure no cmake user package entries exist for zephyr (maybe; optional -- that's how I tested)
3. `west zephyr-export`
Output:
```
$ west zephyr-export
FATAL ERROR: CMake is not installed or cannot be found; cannot build.
Traceback (most recent call last):
File "/home/mbolivar/zephyrproject/zephyr/scripts/west_commands/export.py", line 59, in run_cmake_and_clean_up
lines = run_cmake(['-S', str(path), '-B', str(path)],
File "/home/mbolivar/zephyrproject/zephyr/scripts/west_commands/zcmake.py", line 44, in run_cmake
log.die('CMake is not installed or cannot be found; cannot build.')
File "/home/mbolivar/.virtualenvs/west-test/lib/python3.8/site-packages/west/log.py", line 146, in die
sys.exit(exit_code)
SystemExit: 1
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/home/mbolivar/.virtualenvs/west-test/bin/west", line 8, in <module>
sys.exit(main())
File "/home/mbolivar/.virtualenvs/west-test/lib/python3.8/site-packages/west/app/main.py", line 780, in main
app.run(argv or sys.argv[1:])
File "/home/mbolivar/.virtualenvs/west-test/lib/python3.8/site-packages/west/app/main.py", line 106, in run
self.run_command(argv)
File "/home/mbolivar/.virtualenvs/west-test/lib/python3.8/site-packages/west/app/main.py", line 339, in run_command
self.run_extension(args.command, argv)
File "/home/mbolivar/.virtualenvs/west-test/lib/python3.8/site-packages/west/app/main.py", line 409, in run_extension
command.run(args, unknown, self.topdir, manifest=self.manifest)
File "/home/mbolivar/.virtualenvs/west-test/lib/python3.8/site-packages/west/commands.py", line 116, in run
self.do_run(args, unknown)
File "/home/mbolivar/zephyrproject/zephyr/scripts/west_commands/export.py", line 48, in do_run
run_cmake_and_clean_up(share / 'zephyr-package' / 'cmake')
File "/home/mbolivar/zephyrproject/zephyr/scripts/west_commands/export.py", line 62, in run_cmake_and_clean_up
msg = [line for line in lines if not line.startswith('-- ')]
```
**Expected behavior**
No stack dump, just a clean error message
**Impact**
Annoyance
**Environment (please complete the following information):**
- Commit SHA or Version used: d5e1753eb6c36421a87b535e8e6cb3a9d50e06e4
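The graceful failure this issue asks for amounts to checking for the `cmake` executable before handing control to the CMake runner, so the command can exit with the one-line message instead of letting `SystemExit` unwind through a half-initialized handler. The sketch below is illustrative, not the actual `zcmake.py`/`export.py` API; the injectable `which` parameter exists only to make the behavior easy to demonstrate:

```python
import shutil
import sys

def check_cmake(which=shutil.which):
    """Return the path to the cmake executable, or None if not installed."""
    return which("cmake")

def run_export(which=shutil.which):
    """Fail with a clean one-line error instead of a stack dump."""
    if check_cmake(which) is None:
        # Clean error message, no traceback.
        print("FATAL ERROR: CMake is not installed or cannot be found; "
              "cannot build.", file=sys.stderr)
        raise SystemExit(1)
    # ... normal export path would run CMake here ...
    return "ok"
```

With the check performed up front, the later cleanup code never runs in a state where the CMake output variable was never assigned.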
|
1.0
|
"west zephyr-export" dumps stack if cmake is not installed - **Describe the bug**
The `west zephyr-export` command should fail gracefully if cmake is not installed, but it does not
**To Reproduce**
Steps to reproduce the behavior:
1. uninstall cmake
2. make sure no cmake user package entries exist for zephyr (maybe; optional -- that's how I tested)
3. `west zephyr-export`
Output:
```
$ west zephyr-export
FATAL ERROR: CMake is not installed or cannot be found; cannot build.
Traceback (most recent call last):
File "/home/mbolivar/zephyrproject/zephyr/scripts/west_commands/export.py", line 59, in run_cmake_and_clean_up
lines = run_cmake(['-S', str(path), '-B', str(path)],
File "/home/mbolivar/zephyrproject/zephyr/scripts/west_commands/zcmake.py", line 44, in run_cmake
log.die('CMake is not installed or cannot be found; cannot build.')
File "/home/mbolivar/.virtualenvs/west-test/lib/python3.8/site-packages/west/log.py", line 146, in die
sys.exit(exit_code)
SystemExit: 1
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/home/mbolivar/.virtualenvs/west-test/bin/west", line 8, in <module>
sys.exit(main())
File "/home/mbolivar/.virtualenvs/west-test/lib/python3.8/site-packages/west/app/main.py", line 780, in main
app.run(argv or sys.argv[1:])
File "/home/mbolivar/.virtualenvs/west-test/lib/python3.8/site-packages/west/app/main.py", line 106, in run
self.run_command(argv)
File "/home/mbolivar/.virtualenvs/west-test/lib/python3.8/site-packages/west/app/main.py", line 339, in run_command
self.run_extension(args.command, argv)
File "/home/mbolivar/.virtualenvs/west-test/lib/python3.8/site-packages/west/app/main.py", line 409, in run_extension
command.run(args, unknown, self.topdir, manifest=self.manifest)
File "/home/mbolivar/.virtualenvs/west-test/lib/python3.8/site-packages/west/commands.py", line 116, in run
self.do_run(args, unknown)
File "/home/mbolivar/zephyrproject/zephyr/scripts/west_commands/export.py", line 48, in do_run
run_cmake_and_clean_up(share / 'zephyr-package' / 'cmake')
File "/home/mbolivar/zephyrproject/zephyr/scripts/west_commands/export.py", line 62, in run_cmake_and_clean_up
msg = [line for line in lines if not line.startswith('-- ')]
```
**Expected behavior**
No stack dump, just a clean error message
**Impact**
Annoyance
**Environment (please complete the following information):**
- Commit SHA or Version used: d5e1753eb6c36421a87b535e8e6cb3a9d50e06e4
|
non_process
|
west zephyr export dumps stack if cmake is not installed describe the bug the west zephyr export command should fail gracefully if cmake is not installed but it does not to reproduce steps to reproduce the behavior uninstall cmake make sure no cmake user package entries exist for zephyr maybe optional that s how i tested west zephyr export output west zephyr export fatal error cmake is not installed or cannot be found cannot build traceback most recent call last file home mbolivar zephyrproject zephyr scripts west commands export py line in run cmake and clean up lines run cmake file home mbolivar zephyrproject zephyr scripts west commands zcmake py line in run cmake log die cmake is not installed or cannot be found cannot build file home mbolivar virtualenvs west test lib site packages west log py line in die sys exit exit code systemexit during handling of the above exception another exception occurred traceback most recent call last file home mbolivar virtualenvs west test bin west line in sys exit main file home mbolivar virtualenvs west test lib site packages west app main py line in main app run argv or sys argv file home mbolivar virtualenvs west test lib site packages west app main py line in run self run command argv file home mbolivar virtualenvs west test lib site packages west app main py line in run command self run extension args command argv file home mbolivar virtualenvs west test lib site packages west app main py line in run extension command run args unknown self topdir manifest self manifest file home mbolivar virtualenvs west test lib site packages west commands py line in run self do run args unknown file home mbolivar zephyrproject zephyr scripts west commands export py line in do run run cmake and clean up share zephyr package cmake file home mbolivar zephyrproject zephyr scripts west commands export py line in run cmake and clean up msg expected behavior no stack dump just a clean error message impact annoyance environment please complete 
the following information commit sha or version used
| 0
|
363,087
| 10,737,504,543
|
IssuesEvent
|
2019-10-29 13:14:04
|
grpc/grpc
|
https://api.github.com/repos/grpc/grpc
|
opened
|
Local credentials fail on Windows (win_get_fd always returns -1)
|
kind/bug priority/P2
|
<!--
This form is for bug reports and feature requests ONLY!
For general questions and troubleshooting, please ask/look for answers here:
- grpc.io mailing list: https://groups.google.com/forum/#!forum/grpc-io
- StackOverflow, with "grpc" tag: https://stackoverflow.com/questions/tagged/grpc
Issues specific to *grpc-java*, *grpc-go*, *grpc-node*, *grpc-dart*, *grpc-web* should be created in the repository they belong to (e.g. https://github.com/grpc/grpc-LANGUAGE/issues/new)
-->
### What version of gRPC and what language are you using?
v1.24.3 (Bug exists also on master), C++
### What operating system (Linux, Windows,...) and version?
Windows 10
### What runtime / compiler are you using (e.g. python version or version of gcc)
VS 2017
### What did you do?
I used grpc_impl::experimental::LocalCredentails and LocalServerCredentials in a client and a server I wrote for Windows.
### What did you expect to see?
The client executes an RPC command successfully on the server.
### What did you see instead?
The server and the client reject each other. Status error returned from the stub == 14.
During debugging, I found that win_get_fd implementation (src/core/lib/iomgr/tcp_windows.cc) always returns -1. This led to the failure of local_check_peer (src/core/lib/security/security_connector/local/local_security_connector.cc) to get the fd socket name. Because of that, local credentials are not working under Windows.
I opened a PR that solves this issue by fixing win_get_fd to return the correct socket fd.
|
1.0
|
Local credentials fail on Windows (win_get_fd always returns -1) - <!--
This form is for bug reports and feature requests ONLY!
For general questions and troubleshooting, please ask/look for answers here:
- grpc.io mailing list: https://groups.google.com/forum/#!forum/grpc-io
- StackOverflow, with "grpc" tag: https://stackoverflow.com/questions/tagged/grpc
Issues specific to *grpc-java*, *grpc-go*, *grpc-node*, *grpc-dart*, *grpc-web* should be created in the repository they belong to (e.g. https://github.com/grpc/grpc-LANGUAGE/issues/new)
-->
### What version of gRPC and what language are you using?
v1.24.3 (Bug exists also on master), C++
### What operating system (Linux, Windows,...) and version?
Windows 10
### What runtime / compiler are you using (e.g. python version or version of gcc)
VS 2017
### What did you do?
I used grpc_impl::experimental::LocalCredentails and LocalServerCredentials in a client and a server I wrote for Windows.
### What did you expect to see?
The client executes an RPC command successfully on the server.
### What did you see instead?
The server and the client reject each other. Status error returned from the stub == 14.
During debugging, I found that win_get_fd implementation (src/core/lib/iomgr/tcp_windows.cc) always returns -1. This led to the failure of local_check_peer (src/core/lib/security/security_connector/local/local_security_connector.cc) to get the fd socket name. Because of that, local credentials are not working under Windows.
I opened a PR that solves this issue by fixing win_get_fd to return the correct socket fd.
|
non_process
|
local credentials fail on windows win get fd always returns this form is for bug reports and feature requests only for general questions and troubleshooting please ask look for answers here grpc io mailing list stackoverflow with grpc tag issues specific to grpc java grpc go grpc node grpc dart grpc web should be created in the repository they belong to e g what version of grpc and what language are you using bug exists also on master c what operating system linux windows and version windows what runtime compiler are you using e g python version or version of gcc vs what did you do i used grpc impl experimental localcredentails and localservercredentials in a client and a server i wrote for windows what did you expect to see the client executes an rpc command successfully on the server what did you see instead the server and the client reject each other status error returned from the stub during debugging i found that win get fd implementation src core lib iomgr tcp windows cc always returns this led to the failure of local check peer src core lib security security connector local local security connector cc to get the fd socket name because of that local credentials are not working under windows i opened a pr that solves this issue by fixing win get fd to return the correct socket fd
| 0
|
3,847
| 3,257,769,813
|
IssuesEvent
|
2015-10-20 19:18:07
|
mapbox/mapbox-gl-native
|
https://api.github.com/repos/mapbox/mapbox-gl-native
|
closed
|
test CI on Bitrise
|
build iOS OS X
|
Though Bitrise [doesn't yet support public apps](https://bitrise.uservoice.com/forums/235233-general/suggestions/8795035-allow-public-or-publicly-viewable-apps), that shouldn't block us from trying it out and maybe offloading the long-running iOS Travis jobs there.
Also unclear if we could build for OS X there, since the host machine is OS X. Depends on what SDKs they provide.
|
1.0
|
test CI on Bitrise - Though Bitrise [doesn't yet support public apps](https://bitrise.uservoice.com/forums/235233-general/suggestions/8795035-allow-public-or-publicly-viewable-apps), that shouldn't block us from trying it out and maybe offloading the long-running iOS Travis jobs there.
Also unclear if we could build for OS X there, since the host machine is OS X. Depends on what SDKs they provide.
|
non_process
|
test ci on bitrise though bitrise that shouldn t block us from trying it out and maybe offloading the long running ios travis jobs there also unclear if we could build for os x there since the host machine is os x depends on what sdks they provide
| 0
|
14,677
| 17,792,661,966
|
IssuesEvent
|
2021-08-31 18:04:20
|
MicrosoftDocs/azure-devops-docs
|
https://api.github.com/repos/MicrosoftDocs/azure-devops-docs
|
closed
|
Can I import parameters from templates?
|
doc-enhancement devops/prod devops-cicd-process/tech needs-sme
|
Hi there,
Is it possible to do like
```
parameters:
- template: ./template.yml
```
---
#### Document Details
⚠ *Do not edit this section. It is required for docs.microsoft.com ➟ GitHub issue linking.*
* ID: 790318bb-8220-3241-4ca7-73351074492f
* Version Independent ID: db1da9db-3694-779b-17aa-1ed67fcecf86
* Content: [Use runtime and type-safe parameters - Azure Pipelines](https://docs.microsoft.com/en-us/azure/devops/pipelines/process/runtime-parameters?view=azure-devops&tabs=script)
* Content Source: [docs/pipelines/process/runtime-parameters.md](https://github.com/MicrosoftDocs/azure-devops-docs/blob/master/docs/pipelines/process/runtime-parameters.md)
* Product: **devops**
* Technology: **devops-cicd-process**
* GitHub Login: @juliakm
* Microsoft Alias: **jukullam**
|
1.0
|
Can I import parameters from templates? - Hi there,
Is it possible to do like
```
parameters:
- template: ./template.yml
```
---
#### Document Details
⚠ *Do not edit this section. It is required for docs.microsoft.com ➟ GitHub issue linking.*
* ID: 790318bb-8220-3241-4ca7-73351074492f
* Version Independent ID: db1da9db-3694-779b-17aa-1ed67fcecf86
* Content: [Use runtime and type-safe parameters - Azure Pipelines](https://docs.microsoft.com/en-us/azure/devops/pipelines/process/runtime-parameters?view=azure-devops&tabs=script)
* Content Source: [docs/pipelines/process/runtime-parameters.md](https://github.com/MicrosoftDocs/azure-devops-docs/blob/master/docs/pipelines/process/runtime-parameters.md)
* Product: **devops**
* Technology: **devops-cicd-process**
* GitHub Login: @juliakm
* Microsoft Alias: **jukullam**
|
process
|
can i import parameters from templates hi there is it possible to do like parameters template template yml document details ⚠ do not edit this section it is required for docs microsoft com ➟ github issue linking id version independent id content content source product devops technology devops cicd process github login juliakm microsoft alias jukullam
| 1
|
69,442
| 9,303,655,387
|
IssuesEvent
|
2019-03-24 19:06:31
|
avalonmediasystem/avalon
|
https://api.github.com/repos/avalonmediasystem/avalon
|
closed
|
Documentation Updates
|
documentation in progress sprint 173 sprint 174
|
### Description
Two Avalon documentation pages are out of date and need to be updated: _Supported Browsers and Platforms_ and _Known Issues_.
### Done Looks Like
- [x] [Supported Browsers and Platforms](https://wiki.dlib.indiana.edu/display/VarVideo/Supported+Browsers+and+Platforms) page updated for 6.4.5
- [x] [Known Issues](https://wiki.dlib.indiana.edu/display/VarVideo/Known+Issues) page updated with most recent bug/issues accounted for
|
1.0
|
Documentation Updates - ### Description
Two Avalon documentation pages are out of date and need to be updated: _Supported Browsers and Platforms_ and _Known Issues_.
### Done Looks Like
- [x] [Supported Browsers and Platforms](https://wiki.dlib.indiana.edu/display/VarVideo/Supported+Browsers+and+Platforms) page updated for 6.4.5
- [x] [Known Issues](https://wiki.dlib.indiana.edu/display/VarVideo/Known+Issues) page updated with most recent bug/issues accounted for
|
non_process
|
documentation updates description two avalon documentation pages are out of date and need to be updated supported browsers and platforms and known issues done looks like page updated for page updated with most recent bug issues accounted for
| 0
|
8,301
| 11,463,299,224
|
IssuesEvent
|
2020-02-07 15:45:16
|
wordpress-mobile/gutenberg-mobile
|
https://api.github.com/repos/wordpress-mobile/gutenberg-mobile
|
closed
|
[Android] - Upload retry is broken
|
Android Media bug release-process
|
**Describe the bug**
The retry option for failed uploads in media blocks (image, media & text, etc.) does not work.
**To Reproduce**
Steps to reproduce the behavior:
• Start a new post and add an image block
• Add image from "Choose from device" with airplane mode on (observe failure message)
• Turn airplane mode off and tap image
• Tap retry upload
• :x: Image dims, but does not reupload.. or at least progress does not seem to show
**Expected behavior**
Image should retry uploading, showing a progress indicator.
**Smartphone:**
- Device: Pixel 3a
- OS: Android X
**Additional context**
I encountered this on `develop` and the release branch for `1.22.0`.
I believe this may be due to some recent refactoring work, though I haven't pinpointed the source of the issue. It may be related to code near this path: https://github.com/wordpress-mobile/WordPress-Android/blob/gutenberg/integrate_release_1.22.0/WordPress/src/main/java/org/wordpress/android/ui/posts/editor/media/UploadMediaUseCase.kt#L26 but I haven't thoroughly investigated this. It seems the passed lambda is not invoked, and the `UploadService` intent is never launched (https://github.com/wordpress-mobile/WordPress-Android/blob/gutenberg/integrate_release_1.22.0/WordPress/src/main/java/org/wordpress/android/ui/uploads/UploadService.java#L388).
|
1.0
|
[Android] - Upload retry is broken - **Describe the bug**
The retry option for failed uploads in media blocks (image, media & text, etc.) does not work.
**To Reproduce**
Steps to reproduce the behavior:
• Start a new post and add an image block
• Add image from "Choose from device" with airplane mode on (observe failure message)
• Turn airplane mode off and tap image
• Tap retry upload
• :x: Image dims, but does not reupload.. or at least progress does not seem to show
**Expected behavior**
Image should retry uploading, showing a progress indicator.
**Smartphone:**
- Device: Pixel 3a
- OS: Android X
**Additional context**
I encountered this on `develop` and the release branch for `1.22.0`.
I believe this may be due to some recent refactoring work, though I haven't pinpointed the source of the issue. It may be related to code near this path: https://github.com/wordpress-mobile/WordPress-Android/blob/gutenberg/integrate_release_1.22.0/WordPress/src/main/java/org/wordpress/android/ui/posts/editor/media/UploadMediaUseCase.kt#L26 but I haven't thoroughly investigated this. It seems the passed lambda is not invoked, and the `UploadService` intent is never launched (https://github.com/wordpress-mobile/WordPress-Android/blob/gutenberg/integrate_release_1.22.0/WordPress/src/main/java/org/wordpress/android/ui/uploads/UploadService.java#L388).
|
process
|
upload retry is broken describe the bug the retry option for failed uploads in media blocks image media text etc does not work to reproduce steps to reproduce the behavior • start a new post and add an image block • add image from choose from device with airplane mode on observe failure message • turn airplane mode off and tap image • tap retry upload • x image dims but does not reupload or at least progress does not seem to show expected behavior image should retry uploading showing a progress indicator smartphone device pixel os android x additional context i encountered this on develop and the release branch for i believe this may be due to some recent refactoring work though i haven t pinpointed the source of the issue it may be related to code near this path but i haven t thoroughly investigated this it seems the passed lambda is not invoked and the uploadservice intent is never launched
| 1
|
349,803
| 10,473,613,499
|
IssuesEvent
|
2019-09-23 12:59:58
|
threefoldtech/jumpscaleX_core
|
https://api.github.com/repos/threefoldtech/jumpscaleX_core
|
opened
|
zeroCI: Test Runner front-end
|
priority_major
|
After running the tests on zeroCI, results need to have a good view.
#### We have 2 scenarios for running tests:
1- Run tests for every commit on a branch for a repo.
2- Run testsuite for a specific project like Builders should be run nightly.
#### Tests Result types:
1- Tests that use JUnit should have the status and result for every testcase.
2- Tests that not use JUnit should have the overall result as log.
#### zeroCI APIs:
`/api/`: contains the names of repos and testsuites running on the CI `{"repos": [], projects: []}`
`/api/repos/<repo_name>`: contains the branches running on this `repo_name`.
`/api/repos/<repo_name>?branch=<branch_name>`: contains a simple info about every commit had run on `branch_name`.
`/api/repos/<repo_name>?branch=<branch_name>&id=<test_id>`: contains the result of the test, `test_id` can be obtained from the previous api.
`/api/projects/<project_name>`: contains the test runs on this `project_name`.
`/api/projects/<project_name>?id=<test_id>`: contains the test result of the test, `test_id` can be obtained from the previous api.
|
1.0
|
zeroCI: Test Runner front-end - After running the tests on zeroCI, results need to have a good view.
#### We have 2 scenarios for running tests:
1- Run tests for every commit on a branch for a repo.
2- Run testsuite for a specific project like Builders should be run nightly.
#### Tests Result types:
1- Tests that use JUnit should have the status and result for every testcase.
2- Tests that not use JUnit should have the overall result as log.
#### zeroCI APIs:
`/api/`: contains the names of repos and testsuites running on the CI `{"repos": [], projects: []}`
`/api/repos/<repo_name>`: contains the branches running on this `repo_name`.
`/api/repos/<repo_name>?branch=<branch_name>`: contains a simple info about every commit had run on `branch_name`.
`/api/repos/<repo_name>?branch=<branch_name>&id=<test_id>`: contains the result of the test, `test_id` can be obtained from the previous api.
`/api/projects/<project_name>`: contains the test runs on this `project_name`.
`/api/projects/<project_name>?id=<test_id>`: contains the test result of the test, `test_id` can be obtained from the previous api.
|
non_process
|
zeroci test runner front end after running the tests on zeroci results need to have a good view we have scenarios for running tests run tests for every commit on a branch for a repo run testsuite for a specific project like builders should be run nightly tests result types tests that use junit should have the status and result for every testcase tests that not use junit should have the overall result as log zeroci apis api contains the names of repos and testsuites running on the ci repos projects api repos contains the branches running on this repo name api repos branch contains a simple info about every commit had run on branch name api repos branch id contains the result of the test test id can be obtained from the previous api api projects contains the test runs on this project name api projects id contains the test result of the test test id can be obtained from the previous api
| 0
|
14,200
| 17,100,411,864
|
IssuesEvent
|
2021-07-09 10:23:39
|
metabase/metabase
|
https://api.github.com/repos/metabase/metabase
|
opened
|
Postgres dollar quoting ($$) causes problems when using with variables
|
.Backend Database/Postgres Priority:P3 Querying/Parameters & Variables Querying/Processor Type:Bug
|
**Describe the bug**
When using advanced queries with [dollar quoting](https://www.postgresql.org/docs/current/sql-syntax-lexical.html#SQL-SYNTAX-DOLLAR-QUOTING) on Postgres and with variables, then Metabase gets confused and incorrectly substitutes the dollar quoting, when replacing the variable.
**To Reproduce**
1. Native query > Postgres - where {{filter}} is Text filter
```
-- enable crosstab extension if you don't have it first: CREATE EXTENSION IF NOT EXISTS tablefunc;
SELECT *
FROM crosstab($$
select 'A' as id, '1' as cat, {{filter}} as var
$$)
AS ct (id text, cat text, var text);
```
2. Write something in the filter widget and run the query - fails with `The column index is out of range: 1, number of columns: 0.`
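A simplified model of the mismatch behind the "column index is out of range" error: a substituter that ignores dollar quoting turns the `{{filter}}` inside `$$ … $$` into a bind placeholder, while the driver treats everything inside the dollar quotes as a string literal and therefore counts zero bindable parameters. This is an illustrative sketch (assuming JDBC-style `?` placeholders and ignoring all other quoting rules), not Metabase's actual query processor:

```python
import re

# A $$ ... $$ dollar-quoted region (non-greedy, may span lines).
DOLLAR = re.compile(r"\$\$.*?\$\$", re.S)

def bindable_placeholders(sql):
    """Count the '?' placeholders the driver will actually bind:
    anything inside a $$ ... $$ literal is not a parameter."""
    outside = DOLLAR.sub("", sql)
    return outside.count("?")

# Naive substitution replaces {{filter}} regardless of context:
template = "SELECT * FROM crosstab($$ select {{filter}} as var $$) AS ct(v text)"
substituted = template.replace("{{filter}}", "?")
```

Here `substituted` contains one `?`, so one value gets bound, but `bindable_placeholders(substituted)` is zero, matching the driver-side "number of columns: 0" complaint.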
<details><summary>Full stacktrace</summary>
```
2021-07-09 11:52:16,841 ERROR middleware.catch-exceptions :: Error processing query: null
{:database_id 22,
:started_at #t "2021-07-09T11:52:16.463664+02:00[Europe/Copenhagen]",
:state "22023",
:json_query
{:type "native",
:native
{:query
"SELECT *\nFROM crosstab($$\n select 'A' as id, '1' as cat, {{filter}} as var\n$$)\nAS ct (id text, cat text, var text);",
:template-tags
{:filter {:id "d27e6219-eb47-05a7-1f5b-d33e06f41861", :name "filter", :display-name "Filter", :type "text"}}},
:database 22,
:parameters [{:type "category", :value "test", :target ["variable" ["template-tag" "filter"]]}],
:middleware {:js-int-to-string? true, :add-default-userland-constraints? true}},
:status :failed,
:class org.postgresql.util.PSQLException,
:stacktrace
["org.postgresql.core.v3.SimpleParameterList.bind(SimpleParameterList.java:69)"
"org.postgresql.core.v3.SimpleParameterList.setStringParameter(SimpleParameterList.java:132)"
"org.postgresql.jdbc.PgPreparedStatement.bindString(PgPreparedStatement.java:1060)"
"org.postgresql.jdbc.PgPreparedStatement.setString(PgPreparedStatement.java:356)"
"org.postgresql.jdbc.PgPreparedStatement.setString(PgPreparedStatement.java:342)"
"org.postgresql.jdbc.PgPreparedStatement.setObject(PgPreparedStatement.java:950)"
"com.mchange.v2.c3p0.impl.NewProxyPreparedStatement.setObject(NewProxyPreparedStatement.java:1008)"
"--> driver.sql_jdbc.execute$set_object.invokeStatic(execute.clj:204)"
"driver.sql_jdbc.execute$set_object.invoke(execute.clj:201)"
"driver.sql_jdbc.execute$fn__79653.invokeStatic(execute.clj:213)"
"driver.sql_jdbc.execute$fn__79653.invoke(execute.clj:211)"
"driver.sql_jdbc.execute$set_parameters_BANG_$fn__79671.invoke(execute.clj:257)"
"driver.sql_jdbc.execute$set_parameters_BANG_.invokeStatic(execute.clj:253)"
"driver.sql_jdbc.execute$set_parameters_BANG_.invoke(execute.clj:249)"
"driver.sql_jdbc.execute$fn__79675.invokeStatic(execute.clj:272)"
"driver.sql_jdbc.execute$fn__79675.invoke(execute.clj:260)"
"driver.sql_jdbc.execute$prepared_statement_STAR_.invokeStatic(execute.clj:302)"
"driver.sql_jdbc.execute$prepared_statement_STAR_.invoke(execute.clj:299)"
"driver.sql_jdbc.execute$statement_or_prepared_statement.invokeStatic(execute.clj:326)"
"driver.sql_jdbc.execute$statement_or_prepared_statement.invoke(execute.clj:323)"
"driver.sql_jdbc.execute$execute_reducible_query.invokeStatic(execute.clj:466)"
"driver.sql_jdbc.execute$execute_reducible_query.invoke(execute.clj:453)"
"driver.sql_jdbc.execute$execute_reducible_query.invokeStatic(execute.clj:462)"
"driver.sql_jdbc.execute$execute_reducible_query.invoke(execute.clj:453)"
"driver.sql_jdbc$fn__81350.invokeStatic(sql_jdbc.clj:54)"
"driver.sql_jdbc$fn__81350.invoke(sql_jdbc.clj:52)"
"query_processor.context$executef.invokeStatic(context.clj:59)"
"query_processor.context$executef.invoke(context.clj:48)"
"query_processor.context.default$default_runf.invokeStatic(default.clj:68)"
"query_processor.context.default$default_runf.invoke(default.clj:66)"
"query_processor.context$runf.invokeStatic(context.clj:45)"
"query_processor.context$runf.invoke(context.clj:39)"
"query_processor.reducible$pivot.invokeStatic(reducible.clj:34)"
"query_processor.reducible$pivot.invoke(reducible.clj:31)"
"query_processor.middleware.mbql_to_native$mbql__GT_native$fn__47664.invoke(mbql_to_native.clj:25)"
"query_processor.middleware.check_features$check_features$fn__46778.invoke(check_features.clj:39)"
"query_processor.middleware.limit$limit$fn__47650.invoke(limit.clj:37)"
"query_processor.middleware.cache$maybe_return_cached_results$fn__46230.invoke(cache.clj:211)"
"query_processor.middleware.optimize_temporal_filters$optimize_temporal_filters$fn__47910.invoke(optimize_temporal_filters.clj:204)"
"query_processor.middleware.validate_temporal_bucketing$validate_temporal_bucketing$fn__49840.invoke(validate_temporal_bucketing.clj:50)"
"query_processor.middleware.auto_parse_filter_values$auto_parse_filter_values$fn__45349.invoke(auto_parse_filter_values.clj:43)"
"query_processor.middleware.wrap_value_literals$wrap_value_literals$fn__41497.invoke(wrap_value_literals.clj:161)"
"query_processor.middleware.annotate$add_column_info$fn__41372.invoke(annotate.clj:605)"
"query_processor.middleware.permissions$check_query_permissions$fn__46650.invoke(permissions.clj:81)"
"query_processor.middleware.pre_alias_aggregations$pre_alias_aggregations$fn__48768.invoke(pre_alias_aggregations.clj:40)"
"query_processor.middleware.cumulative_aggregations$handle_cumulative_aggregations$fn__46851.invoke(cumulative_aggregations.clj:60)"
"query_processor.middleware.resolve_joined_fields$resolve_joined_fields$fn__49067.invoke(resolve_joined_fields.clj:102)"
"query_processor.middleware.resolve_joins$resolve_joins$fn__49380.invoke(resolve_joins.clj:171)"
"query_processor.middleware.add_implicit_joins$add_implicit_joins$fn__44925.invoke(add_implicit_joins.clj:190)"
"query_processor.middleware.large_int_id$convert_id_to_string$fn__47614.invoke(large_int_id.clj:59)"
"query_processor.middleware.format_rows$format_rows$fn__47595.invoke(format_rows.clj:74)"
"query_processor.middleware.add_default_temporal_unit$add_default_temporal_unit$fn__44219.invoke(add_default_temporal_unit.clj:23)"
"query_processor.middleware.desugar$desugar$fn__46917.invoke(desugar.clj:21)"
"query_processor.middleware.binning$update_binning_strategy$fn__45736.invoke(binning.clj:227)"
"query_processor.middleware.resolve_fields$resolve_fields$fn__46453.invoke(resolve_fields.clj:34)"
"query_processor.middleware.add_dimension_projections$add_remapping$fn__44574.invoke(add_dimension_projections.clj:312)"
"query_processor.middleware.add_implicit_clauses$add_implicit_clauses$fn__44803.invoke(add_implicit_clauses.clj:147)"
"query_processor.middleware.upgrade_field_literals$upgrade_field_literals$fn__49789.invoke(upgrade_field_literals.clj:40)"
"query_processor.middleware.add_source_metadata$add_source_metadata_for_source_queries$fn__45088.invoke(add_source_metadata.clj:123)"
"query_processor.middleware.reconcile_breakout_and_order_by_bucketing$reconcile_breakout_and_order_by_bucketing$fn__48942.invoke(reconcile_breakout_and_order_by_bucketing.clj:100)"
"query_processor.middleware.auto_bucket_datetimes$auto_bucket_datetimes$fn__45296.invoke(auto_bucket_datetimes.clj:147)"
"query_processor.middleware.resolve_source_table$resolve_source_tables$fn__46500.invoke(resolve_source_table.clj:45)"
"query_processor.middleware.parameters$substitute_parameters$fn__48750.invoke(parameters.clj:111)"
"query_processor.middleware.resolve_referenced$resolve_referenced_card_resources$fn__46552.invoke(resolve_referenced.clj:79)"
"query_processor.middleware.expand_macros$expand_macros$fn__47301.invoke(expand_macros.clj:184)"
"query_processor.middleware.add_timezone_info$add_timezone_info$fn__45097.invoke(add_timezone_info.clj:15)"
"query_processor.middleware.splice_params_in_response$splice_params_in_response$fn__49742.invoke(splice_params_in_response.clj:32)"
"query_processor.middleware.resolve_database_and_driver$resolve_database_and_driver$fn__48953$fn__48957.invoke(resolve_database_and_driver.clj:31)"
"driver$do_with_driver.invokeStatic(driver.clj:60)"
"driver$do_with_driver.invoke(driver.clj:56)"
"query_processor.middleware.resolve_database_and_driver$resolve_database_and_driver$fn__48953.invoke(resolve_database_and_driver.clj:25)"
"query_processor.middleware.fetch_source_query$resolve_card_id_source_tables$fn__47541.invoke(fetch_source_query.clj:274)"
"query_processor.middleware.store$initialize_store$fn__49751$fn__49752.invoke(store.clj:11)"
"query_processor.store$do_with_store.invokeStatic(store.clj:44)"
"query_processor.store$do_with_store.invoke(store.clj:38)"
"query_processor.middleware.store$initialize_store$fn__49751.invoke(store.clj:10)"
"query_processor.middleware.validate$validate_query$fn__49796.invoke(validate.clj:10)"
"query_processor.middleware.normalize_query$normalize$fn__47677.invoke(normalize_query.clj:22)"
"query_processor.middleware.add_rows_truncated$add_rows_truncated$fn__44943.invoke(add_rows_truncated.clj:35)"
"query_processor.middleware.results_metadata$record_and_return_metadata_BANG_$fn__49727.invoke(results_metadata.clj:147)"
"query_processor.middleware.constraints$add_default_userland_constraints$fn__46794.invoke(constraints.clj:42)"
"query_processor.middleware.process_userland_query$process_userland_query$fn__48839.invoke(process_userland_query.clj:135)"
"query_processor.middleware.catch_exceptions$catch_exceptions$fn__46734.invoke(catch_exceptions.clj:173)"
"query_processor.reducible$async_qp$qp_STAR___38051$thunk__38052.invoke(reducible.clj:103)"
"query_processor.reducible$async_qp$qp_STAR___38051.invoke(reducible.clj:109)"
"query_processor.reducible$sync_qp$qp_STAR___38060$fn__38063.invoke(reducible.clj:135)"
"query_processor.reducible$sync_qp$qp_STAR___38060.invoke(reducible.clj:134)"
"query_processor$process_userland_query.invokeStatic(query_processor.clj:241)"
"query_processor$process_userland_query.doInvoke(query_processor.clj:237)"
"query_processor$fn__49886$process_query_and_save_execution_BANG___49895$fn__49898.invoke(query_processor.clj:253)"
"query_processor$fn__49886$process_query_and_save_execution_BANG___49895.invoke(query_processor.clj:245)"
"query_processor$fn__49930$process_query_and_save_with_max_results_constraints_BANG___49939$fn__49942.invoke(query_processor.clj:265)"
"query_processor$fn__49930$process_query_and_save_with_max_results_constraints_BANG___49939.invoke(query_processor.clj:258)"
"api.dataset$run_query_async$fn__56143.invoke(dataset.clj:56)"
"query_processor.streaming$streaming_response_STAR_$fn__56122$fn__56123.invoke(streaming.clj:72)"
"query_processor.streaming$streaming_response_STAR_$fn__56122.invoke(streaming.clj:71)"
"async.streaming_response$do_f_STAR_.invokeStatic(streaming_response.clj:65)"
"async.streaming_response$do_f_STAR_.invoke(streaming_response.clj:63)"
"async.streaming_response$do_f_async$fn__16071.invoke(streaming_response.clj:84)"],
:context :ad-hoc,
:error "The column index is out of range: 1, number of columns: 0.",
:row_count 0,
:running_time 0,
:data {:rows [], :cols []}}
```
</details>
Possible workaround - see the [forum topic](https://discourse.metabase.com/t/getting-the-column-index-is-out-of-range-error-when-adding-where-clause-with-text-variable/15647) for a more descriptive workaround:
```
SELECT *
FROM crosstab(concat(
'select ''A'' as id, ''1'' as cat, ',
'''', {{filter}}, '''',
' as var'
))
AS ct (id text, cat text, var text);
```
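A rough model of why the original query fails while the `concat()` workaround succeeds (an illustrative sketch, not Metabase's actual substitution code; it assumes Metabase replaces `{{filter}}` with a JDBC-style `?` placeholder, which inside `$$...$$` is literal text to PostgreSQL, so the server-side statement ends up with zero bindable parameters):

```python
import re

def count_bindable_params(sql: str) -> int:
    """Count '?' placeholders that PostgreSQL will actually treat as
    bind parameters, i.e. those NOT inside a $$...$$ dollar-quoted
    string (where everything, including '?', is literal text)."""
    # re.split drops the matched dollar-quoted spans, so every chunk
    # that remains lies outside the dollar quoting.
    outside = re.split(r"\$\$.*?\$\$", sql, flags=re.DOTALL)
    return sum(chunk.count("?") for chunk in outside)

# Failing form: the placeholder lands inside the dollar quotes, so the
# server sees 0 parameters while the client binds 1 value ->
# "The column index is out of range: 1, number of columns: 0."
failing = ("SELECT * FROM crosstab($$ select 'A' as id, '1' as cat, "
           "? as var $$) AS ct (id text, cat text, var text);")
# Workaround form: concat() keeps the placeholder outside any quoting.
working = ("SELECT * FROM crosstab(concat('select ''A'' as id, "
           "''1'' as cat, ', '''', ?, '''', ' as var')) "
           "AS ct (id text, cat text, var text);")

print(count_bindable_params(failing))  # 0
print(count_bindable_params(working))  # 1
```

In the workaround the quoting is built with `concat()` and escaped single quotes, so the driver's placeholder stays visible to the server as a real parameter.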
**Information about your Metabase Installation:**
Tested 0.37.8 thru 0.40.0
**Severity**
Only giving P3, since there's a workaround, though it makes the SQL more difficult to handle - close to P2.
**Additional context**
Likely related to #10257
|
1.0
|
Postgres dollar quoting ($$) causes problems when using with variables - **Describe the bug**
When using advanced queries with [dollar quoting](https://www.postgresql.org/docs/current/sql-syntax-lexical.html#SQL-SYNTAX-DOLLAR-QUOTING) on Postgres together with variables, Metabase gets confused and the variable substitution inside the dollar quoting goes wrong.
**To Reproduce**
1. Native query > Postgres - where {{filter}} is Text filter
```
-- enable crosstab extension if you don't have it first: CREATE EXTENSION IF NOT EXISTS tablefunc;
SELECT *
FROM crosstab($$
select 'A' as id, '1' as cat, {{filter}} as var
$$)
AS ct (id text, cat text, var text);
```
2. Write something in the filter widget and run the query - fails with `The column index is out of range: 1, number of columns: 0.`
<details><summary>Full stacktrace</summary>
```
2021-07-09 11:52:16,841 ERROR middleware.catch-exceptions :: Error processing query: null
{:database_id 22,
:started_at #t "2021-07-09T11:52:16.463664+02:00[Europe/Copenhagen]",
:state "22023",
:json_query
{:type "native",
:native
{:query
"SELECT *\nFROM crosstab($$\n select 'A' as id, '1' as cat, {{filter}} as var\n$$)\nAS ct (id text, cat text, var text);",
:template-tags
{:filter {:id "d27e6219-eb47-05a7-1f5b-d33e06f41861", :name "filter", :display-name "Filter", :type "text"}}},
:database 22,
:parameters [{:type "category", :value "test", :target ["variable" ["template-tag" "filter"]]}],
:middleware {:js-int-to-string? true, :add-default-userland-constraints? true}},
:status :failed,
:class org.postgresql.util.PSQLException,
:stacktrace
["org.postgresql.core.v3.SimpleParameterList.bind(SimpleParameterList.java:69)"
"org.postgresql.core.v3.SimpleParameterList.setStringParameter(SimpleParameterList.java:132)"
"org.postgresql.jdbc.PgPreparedStatement.bindString(PgPreparedStatement.java:1060)"
"org.postgresql.jdbc.PgPreparedStatement.setString(PgPreparedStatement.java:356)"
"org.postgresql.jdbc.PgPreparedStatement.setString(PgPreparedStatement.java:342)"
"org.postgresql.jdbc.PgPreparedStatement.setObject(PgPreparedStatement.java:950)"
"com.mchange.v2.c3p0.impl.NewProxyPreparedStatement.setObject(NewProxyPreparedStatement.java:1008)"
"--> driver.sql_jdbc.execute$set_object.invokeStatic(execute.clj:204)"
"driver.sql_jdbc.execute$set_object.invoke(execute.clj:201)"
"driver.sql_jdbc.execute$fn__79653.invokeStatic(execute.clj:213)"
"driver.sql_jdbc.execute$fn__79653.invoke(execute.clj:211)"
"driver.sql_jdbc.execute$set_parameters_BANG_$fn__79671.invoke(execute.clj:257)"
"driver.sql_jdbc.execute$set_parameters_BANG_.invokeStatic(execute.clj:253)"
"driver.sql_jdbc.execute$set_parameters_BANG_.invoke(execute.clj:249)"
"driver.sql_jdbc.execute$fn__79675.invokeStatic(execute.clj:272)"
"driver.sql_jdbc.execute$fn__79675.invoke(execute.clj:260)"
"driver.sql_jdbc.execute$prepared_statement_STAR_.invokeStatic(execute.clj:302)"
"driver.sql_jdbc.execute$prepared_statement_STAR_.invoke(execute.clj:299)"
"driver.sql_jdbc.execute$statement_or_prepared_statement.invokeStatic(execute.clj:326)"
"driver.sql_jdbc.execute$statement_or_prepared_statement.invoke(execute.clj:323)"
"driver.sql_jdbc.execute$execute_reducible_query.invokeStatic(execute.clj:466)"
"driver.sql_jdbc.execute$execute_reducible_query.invoke(execute.clj:453)"
"driver.sql_jdbc.execute$execute_reducible_query.invokeStatic(execute.clj:462)"
"driver.sql_jdbc.execute$execute_reducible_query.invoke(execute.clj:453)"
"driver.sql_jdbc$fn__81350.invokeStatic(sql_jdbc.clj:54)"
"driver.sql_jdbc$fn__81350.invoke(sql_jdbc.clj:52)"
"query_processor.context$executef.invokeStatic(context.clj:59)"
"query_processor.context$executef.invoke(context.clj:48)"
"query_processor.context.default$default_runf.invokeStatic(default.clj:68)"
"query_processor.context.default$default_runf.invoke(default.clj:66)"
"query_processor.context$runf.invokeStatic(context.clj:45)"
"query_processor.context$runf.invoke(context.clj:39)"
"query_processor.reducible$pivot.invokeStatic(reducible.clj:34)"
"query_processor.reducible$pivot.invoke(reducible.clj:31)"
"query_processor.middleware.mbql_to_native$mbql__GT_native$fn__47664.invoke(mbql_to_native.clj:25)"
"query_processor.middleware.check_features$check_features$fn__46778.invoke(check_features.clj:39)"
"query_processor.middleware.limit$limit$fn__47650.invoke(limit.clj:37)"
"query_processor.middleware.cache$maybe_return_cached_results$fn__46230.invoke(cache.clj:211)"
"query_processor.middleware.optimize_temporal_filters$optimize_temporal_filters$fn__47910.invoke(optimize_temporal_filters.clj:204)"
"query_processor.middleware.validate_temporal_bucketing$validate_temporal_bucketing$fn__49840.invoke(validate_temporal_bucketing.clj:50)"
"query_processor.middleware.auto_parse_filter_values$auto_parse_filter_values$fn__45349.invoke(auto_parse_filter_values.clj:43)"
"query_processor.middleware.wrap_value_literals$wrap_value_literals$fn__41497.invoke(wrap_value_literals.clj:161)"
"query_processor.middleware.annotate$add_column_info$fn__41372.invoke(annotate.clj:605)"
"query_processor.middleware.permissions$check_query_permissions$fn__46650.invoke(permissions.clj:81)"
"query_processor.middleware.pre_alias_aggregations$pre_alias_aggregations$fn__48768.invoke(pre_alias_aggregations.clj:40)"
"query_processor.middleware.cumulative_aggregations$handle_cumulative_aggregations$fn__46851.invoke(cumulative_aggregations.clj:60)"
"query_processor.middleware.resolve_joined_fields$resolve_joined_fields$fn__49067.invoke(resolve_joined_fields.clj:102)"
"query_processor.middleware.resolve_joins$resolve_joins$fn__49380.invoke(resolve_joins.clj:171)"
"query_processor.middleware.add_implicit_joins$add_implicit_joins$fn__44925.invoke(add_implicit_joins.clj:190)"
"query_processor.middleware.large_int_id$convert_id_to_string$fn__47614.invoke(large_int_id.clj:59)"
"query_processor.middleware.format_rows$format_rows$fn__47595.invoke(format_rows.clj:74)"
"query_processor.middleware.add_default_temporal_unit$add_default_temporal_unit$fn__44219.invoke(add_default_temporal_unit.clj:23)"
"query_processor.middleware.desugar$desugar$fn__46917.invoke(desugar.clj:21)"
"query_processor.middleware.binning$update_binning_strategy$fn__45736.invoke(binning.clj:227)"
"query_processor.middleware.resolve_fields$resolve_fields$fn__46453.invoke(resolve_fields.clj:34)"
"query_processor.middleware.add_dimension_projections$add_remapping$fn__44574.invoke(add_dimension_projections.clj:312)"
"query_processor.middleware.add_implicit_clauses$add_implicit_clauses$fn__44803.invoke(add_implicit_clauses.clj:147)"
"query_processor.middleware.upgrade_field_literals$upgrade_field_literals$fn__49789.invoke(upgrade_field_literals.clj:40)"
"query_processor.middleware.add_source_metadata$add_source_metadata_for_source_queries$fn__45088.invoke(add_source_metadata.clj:123)"
"query_processor.middleware.reconcile_breakout_and_order_by_bucketing$reconcile_breakout_and_order_by_bucketing$fn__48942.invoke(reconcile_breakout_and_order_by_bucketing.clj:100)"
"query_processor.middleware.auto_bucket_datetimes$auto_bucket_datetimes$fn__45296.invoke(auto_bucket_datetimes.clj:147)"
"query_processor.middleware.resolve_source_table$resolve_source_tables$fn__46500.invoke(resolve_source_table.clj:45)"
"query_processor.middleware.parameters$substitute_parameters$fn__48750.invoke(parameters.clj:111)"
"query_processor.middleware.resolve_referenced$resolve_referenced_card_resources$fn__46552.invoke(resolve_referenced.clj:79)"
"query_processor.middleware.expand_macros$expand_macros$fn__47301.invoke(expand_macros.clj:184)"
"query_processor.middleware.add_timezone_info$add_timezone_info$fn__45097.invoke(add_timezone_info.clj:15)"
"query_processor.middleware.splice_params_in_response$splice_params_in_response$fn__49742.invoke(splice_params_in_response.clj:32)"
"query_processor.middleware.resolve_database_and_driver$resolve_database_and_driver$fn__48953$fn__48957.invoke(resolve_database_and_driver.clj:31)"
"driver$do_with_driver.invokeStatic(driver.clj:60)"
"driver$do_with_driver.invoke(driver.clj:56)"
"query_processor.middleware.resolve_database_and_driver$resolve_database_and_driver$fn__48953.invoke(resolve_database_and_driver.clj:25)"
"query_processor.middleware.fetch_source_query$resolve_card_id_source_tables$fn__47541.invoke(fetch_source_query.clj:274)"
"query_processor.middleware.store$initialize_store$fn__49751$fn__49752.invoke(store.clj:11)"
"query_processor.store$do_with_store.invokeStatic(store.clj:44)"
"query_processor.store$do_with_store.invoke(store.clj:38)"
"query_processor.middleware.store$initialize_store$fn__49751.invoke(store.clj:10)"
"query_processor.middleware.validate$validate_query$fn__49796.invoke(validate.clj:10)"
"query_processor.middleware.normalize_query$normalize$fn__47677.invoke(normalize_query.clj:22)"
"query_processor.middleware.add_rows_truncated$add_rows_truncated$fn__44943.invoke(add_rows_truncated.clj:35)"
"query_processor.middleware.results_metadata$record_and_return_metadata_BANG_$fn__49727.invoke(results_metadata.clj:147)"
"query_processor.middleware.constraints$add_default_userland_constraints$fn__46794.invoke(constraints.clj:42)"
"query_processor.middleware.process_userland_query$process_userland_query$fn__48839.invoke(process_userland_query.clj:135)"
"query_processor.middleware.catch_exceptions$catch_exceptions$fn__46734.invoke(catch_exceptions.clj:173)"
"query_processor.reducible$async_qp$qp_STAR___38051$thunk__38052.invoke(reducible.clj:103)"
"query_processor.reducible$async_qp$qp_STAR___38051.invoke(reducible.clj:109)"
"query_processor.reducible$sync_qp$qp_STAR___38060$fn__38063.invoke(reducible.clj:135)"
"query_processor.reducible$sync_qp$qp_STAR___38060.invoke(reducible.clj:134)"
"query_processor$process_userland_query.invokeStatic(query_processor.clj:241)"
"query_processor$process_userland_query.doInvoke(query_processor.clj:237)"
"query_processor$fn__49886$process_query_and_save_execution_BANG___49895$fn__49898.invoke(query_processor.clj:253)"
"query_processor$fn__49886$process_query_and_save_execution_BANG___49895.invoke(query_processor.clj:245)"
"query_processor$fn__49930$process_query_and_save_with_max_results_constraints_BANG___49939$fn__49942.invoke(query_processor.clj:265)"
"query_processor$fn__49930$process_query_and_save_with_max_results_constraints_BANG___49939.invoke(query_processor.clj:258)"
"api.dataset$run_query_async$fn__56143.invoke(dataset.clj:56)"
"query_processor.streaming$streaming_response_STAR_$fn__56122$fn__56123.invoke(streaming.clj:72)"
"query_processor.streaming$streaming_response_STAR_$fn__56122.invoke(streaming.clj:71)"
"async.streaming_response$do_f_STAR_.invokeStatic(streaming_response.clj:65)"
"async.streaming_response$do_f_STAR_.invoke(streaming_response.clj:63)"
"async.streaming_response$do_f_async$fn__16071.invoke(streaming_response.clj:84)"],
:context :ad-hoc,
:error "The column index is out of range: 1, number of columns: 0.",
:row_count 0,
:running_time 0,
:data {:rows [], :cols []}}
```
</details>
Possible workaround - see the [forum topic](https://discourse.metabase.com/t/getting-the-column-index-is-out-of-range-error-when-adding-where-clause-with-text-variable/15647) for a more descriptive workaround:
```
SELECT *
FROM crosstab(concat(
'select ''A'' as id, ''1'' as cat, ',
'''', {{filter}}, '''',
' as var'
))
AS ct (id text, cat text, var text);
```
**Information about your Metabase Installation:**
Tested 0.37.8 thru 0.40.0
**Severity**
Only giving P3, since there's a workaround, though it makes the SQL more difficult to handle - close to P2.
**Additional context**
Likely related to #10257
|
process
|
postgres dollar quoting causes problems when using with variables describe the bug when using advanced queries with on postgres and with variables then metabase gets confused and incorrectly substitutes the dollar quoting when replacing the variable to reproduce native query postgres where filter is text filter enable crosstab extension if you don t have it first create extension if not exists tablefunc select from crosstab select a as id as cat filter as var as ct id text cat text var text write something in the filter widget and run the query fails with the column index is out of range number of columns full stacktrace error middleware catch exceptions error processing query null database id started at t state json query type native native query select nfrom crosstab n select a as id as cat filter as var n nas ct id text cat text var text template tags filter id name filter display name filter type text database parameters middleware js int to string true add default userland constraints true status failed class org postgresql util psqlexception stacktrace org postgresql core simpleparameterlist bind simpleparameterlist java org postgresql core simpleparameterlist setstringparameter simpleparameterlist java org postgresql jdbc pgpreparedstatement bindstring pgpreparedstatement java org postgresql jdbc pgpreparedstatement setstring pgpreparedstatement java org postgresql jdbc pgpreparedstatement setstring pgpreparedstatement java org postgresql jdbc pgpreparedstatement setobject pgpreparedstatement java com mchange impl newproxypreparedstatement setobject newproxypreparedstatement java driver sql jdbc execute set object invokestatic execute clj driver sql jdbc execute set object invoke execute clj driver sql jdbc execute fn invokestatic execute clj driver sql jdbc execute fn invoke execute clj driver sql jdbc execute set parameters bang fn invoke execute clj driver sql jdbc execute set parameters bang invokestatic execute clj driver sql jdbc execute set parameters 
bang invoke execute clj driver sql jdbc execute fn invokestatic execute clj driver sql jdbc execute fn invoke execute clj driver sql jdbc execute prepared statement star invokestatic execute clj driver sql jdbc execute prepared statement star invoke execute clj driver sql jdbc execute statement or prepared statement invokestatic execute clj driver sql jdbc execute statement or prepared statement invoke execute clj driver sql jdbc execute execute reducible query invokestatic execute clj driver sql jdbc execute execute reducible query invoke execute clj driver sql jdbc execute execute reducible query invokestatic execute clj driver sql jdbc execute execute reducible query invoke execute clj driver sql jdbc fn invokestatic sql jdbc clj driver sql jdbc fn invoke sql jdbc clj query processor context executef invokestatic context clj query processor context executef invoke context clj query processor context default default runf invokestatic default clj query processor context default default runf invoke default clj query processor context runf invokestatic context clj query processor context runf invoke context clj query processor reducible pivot invokestatic reducible clj query processor reducible pivot invoke reducible clj query processor middleware mbql to native mbql gt native fn invoke mbql to native clj query processor middleware check features check features fn invoke check features clj query processor middleware limit limit fn invoke limit clj query processor middleware cache maybe return cached results fn invoke cache clj query processor middleware optimize temporal filters optimize temporal filters fn invoke optimize temporal filters clj query processor middleware validate temporal bucketing validate temporal bucketing fn invoke validate temporal bucketing clj query processor middleware auto parse filter values auto parse filter values fn invoke auto parse filter values clj query processor middleware wrap value literals wrap value literals fn invoke wrap value 
literals clj query processor middleware annotate add column info fn invoke annotate clj query processor middleware permissions check query permissions fn invoke permissions clj query processor middleware pre alias aggregations pre alias aggregations fn invoke pre alias aggregations clj query processor middleware cumulative aggregations handle cumulative aggregations fn invoke cumulative aggregations clj query processor middleware resolve joined fields resolve joined fields fn invoke resolve joined fields clj query processor middleware resolve joins resolve joins fn invoke resolve joins clj query processor middleware add implicit joins add implicit joins fn invoke add implicit joins clj query processor middleware large int id convert id to string fn invoke large int id clj query processor middleware format rows format rows fn invoke format rows clj query processor middleware add default temporal unit add default temporal unit fn invoke add default temporal unit clj query processor middleware desugar desugar fn invoke desugar clj query processor middleware binning update binning strategy fn invoke binning clj query processor middleware resolve fields resolve fields fn invoke resolve fields clj query processor middleware add dimension projections add remapping fn invoke add dimension projections clj query processor middleware add implicit clauses add implicit clauses fn invoke add implicit clauses clj query processor middleware upgrade field literals upgrade field literals fn invoke upgrade field literals clj query processor middleware add source metadata add source metadata for source queries fn invoke add source metadata clj query processor middleware reconcile breakout and order by bucketing reconcile breakout and order by bucketing fn invoke reconcile breakout and order by bucketing clj query processor middleware auto bucket datetimes auto bucket datetimes fn invoke auto bucket datetimes clj query processor middleware resolve source table resolve source tables fn 
invoke resolve source table clj query processor middleware parameters substitute parameters fn invoke parameters clj query processor middleware resolve referenced resolve referenced card resources fn invoke resolve referenced clj query processor middleware expand macros expand macros fn invoke expand macros clj query processor middleware add timezone info add timezone info fn invoke add timezone info clj query processor middleware splice params in response splice params in response fn invoke splice params in response clj query processor middleware resolve database and driver resolve database and driver fn fn invoke resolve database and driver clj driver do with driver invokestatic driver clj driver do with driver invoke driver clj query processor middleware resolve database and driver resolve database and driver fn invoke resolve database and driver clj query processor middleware fetch source query resolve card id source tables fn invoke fetch source query clj query processor middleware store initialize store fn fn invoke store clj query processor store do with store invokestatic store clj query processor store do with store invoke store clj query processor middleware store initialize store fn invoke store clj query processor middleware validate validate query fn invoke validate clj query processor middleware normalize query normalize fn invoke normalize query clj query processor middleware add rows truncated add rows truncated fn invoke add rows truncated clj query processor middleware results metadata record and return metadata bang fn invoke results metadata clj query processor middleware constraints add default userland constraints fn invoke constraints clj query processor middleware process userland query process userland query fn invoke process userland query clj query processor middleware catch exceptions catch exceptions fn invoke catch exceptions clj query processor reducible async qp qp star thunk invoke reducible clj query processor reducible async qp qp 
star invoke reducible clj query processor reducible sync qp qp star fn invoke reducible clj query processor reducible sync qp qp star invoke reducible clj query processor process userland query invokestatic query processor clj query processor process userland query doinvoke query processor clj query processor fn process query and save execution bang fn invoke query processor clj query processor fn process query and save execution bang invoke query processor clj query processor fn process query and save with max results constraints bang fn invoke query processor clj query processor fn process query and save with max results constraints bang invoke query processor clj api dataset run query async fn invoke dataset clj query processor streaming streaming response star fn fn invoke streaming clj query processor streaming streaming response star fn invoke streaming clj async streaming response do f star invokestatic streaming response clj async streaming response do f star invoke streaming response clj async streaming response do f async fn invoke streaming response clj context ad hoc error the column index is out of range number of columns row count running time data rows cols possible workaround see for more descriptive workaround select from crosstab concat select a as id as cat filter as var as ct id text cat text var text information about your metabase installation tested thru severity only giving since there s a workaround though makes the sql more difficult to handle close additional context likely related to
| 1
|
136,160
| 19,715,579,782
|
IssuesEvent
|
2022-01-13 10:38:44
|
HorizenOfficial/ginger-lib
|
https://api.github.com/repos/HorizenOfficial/ginger-lib
|
closed
|
Restructuring pairings
|
enhancement sw design
|
At present, the pairing engines for different types of pairing-friendly curves are implemented independently, resulting in five pieces of very similar code for the Ate pairing evaluation:
- `algebra/src/curves/models/bls12/mod.rs`, ZEXE's generic implementation for BLS12 curves in affine coordinates,
- `algebra/src/curves/mnt6/mod.rs`, ZEXE's specific implementation for the MNT6-298,
- `algebra/src/curves/sw6/mod.rs`, another implementation for the Cocks-Pinch curve-782 of embedding degree 6 (ZEXE's outer curve with respect to the BLS12-377),
- and our recent implementations for generic MNT4 curves, `algebra/src/curves/models/mnt4/mod.rs`, and MNT6 curves, `algebra/src/curves/models/mnt6/mod.rs`.
I propose a different level of abstraction, introducing `twist2Ate` and `twist6Ate` as pairing types according to the two representations of G2 found in the above implementations: either as a subgroup of a quadratic twist (as for MNT4/6, mnt6, sw6) or of a sextic twist (BLS12) of the curve.
Both `twist2Ate` and `twist6Ate` need to allow the base field for the twist E' to be a general extension field of the same characteristic as the one for E - but the operations for the evaluation of the Ate pairing are generic, i.e. independent of the degree of the base field for E'.
Such an abstraction makes it possible to reuse these two pairing types, and to aggregate the pairing evaluation code in its own folder (e.g. `algebra/src/pairings/`) separated from `algebra/src/curves`, which also contains non-pairing-friendly curves such as the JubJub.
In my opinion such a structure is clearer than the present one. Moreover, it makes it easier to find all the pairing code when extending the lib with further pairing types (e.g. ones that use cubic twists).
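To make the proposal concrete, here is a minimal sketch of the intended two-type hierarchy. It is written in Python purely for illustration (the lib itself is Rust), and the class names `Twist2Ate`/`Twist6Ate` and the `describe` helper are hypothetical stand-ins for the shared Miller-loop / final-exponentiation code:

```python
from abc import ABC, abstractmethod

class TwistAtePairing(ABC):
    """Shared shape of an Ate pairing engine. The generic evaluation
    skeleton lives here once; only how G2 is represented (the twist
    degree) distinguishes the two proposed pairing types."""

    @abstractmethod
    def twist_degree(self) -> int:
        ...

    def describe(self, curve: str) -> str:
        # Stand-in for the generic pairing evaluation code that is
        # currently copied five times across the curve modules.
        return f"{curve}: Ate pairing via a degree-{self.twist_degree()} twist"

class Twist2Ate(TwistAtePairing):
    """G2 as a subgroup of a quadratic twist (MNT4/6, mnt6, sw6)."""
    def twist_degree(self) -> int:
        return 2

class Twist6Ate(TwistAtePairing):
    """G2 as a subgroup of a sextic twist (BLS12 curves)."""
    def twist_degree(self) -> int:
        return 6

# The five near-identical engines collapse onto two reusable types:
engines = {
    "BLS12 (generic)": Twist6Ate(),
    "MNT6-298": Twist2Ate(),
    "SW6 (curve-782)": Twist2Ate(),
    "MNT4 (generic)": Twist2Ate(),
    "MNT6 (generic)": Twist2Ate(),
}
for curve, engine in engines.items():
    print(engine.describe(curve))
```

The point of the sketch is the folder-level consolidation: each concrete curve module would only select one of the two pairing types instead of carrying its own copy of the evaluation code.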
|
1.0
|
Restructuring pairings - At present, the pairing engines for different types of pairing-friendly curves are implemented independently, resulting in five pieces of very similar code for the Ate pairing evaluation:
- `algebra/src/curves/models/bls12/mod.rs`, ZEXE's generic implementation for BLS12 curves in affine coordinates,
- `algebra/src/curves/mnt6/mod.rs`, ZEXE's specific implementation for the MNT6-298,
- `algebra/src/curves/sw6/mod.rs`, another implementation for the Cocks-Pinch curve-782 of embedding degree 6 (ZEXE's outer curve with respect to the BLS12-377),
- and our recent implementations for generic MNT4 curves, `algebra/src/curves/models/mnt4/mod.rs`, and MNT6 curves, `algebra/src/curves/models/mnt6/mod.rs`.
I propose a different level of abstraction, introducing `twist2Ate` and `twist6Ate` as pairing types according to the two representations of G2 found in the above implementations: either as a subgroup of a quadratic twist (as for MNT4/6, mnt6, sw6) or of a sextic twist (BLS12) of the curve.
Both `twist2Ate` and `twist6Ate` need to allow the base field for the twist E' to be a general extension field of the same characteristic as the one for E - but the operations for the evaluation of the Ate pairing are generic, i.e. independent of the degree of the base field for E'.
Such an abstraction makes it possible to reuse these two pairing types, and to aggregate the pairing evaluation code in its own folder (e.g. `algebra/src/pairings/`) separated from `algebra/src/curves`, which also contains non-pairing-friendly curves such as the JubJub.
In my opinion such a structure is clearer than the present one. Moreover, it makes it easier to find all the pairing code when extending the lib with further pairing types (e.g. ones using cubic twists).
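The proposed split can be sketched as a pair of marker types driving one generic evaluation routine. This is a sketch only: `TwistModel`, `Twist2Ate` and `Twist6Ate` are assumed names for illustration, not existing ZEXE types.

```rust
// Sketch only: one generic Ate-pairing driver, parameterized over how G2
// is represented on the twist E'.
trait TwistModel {
    /// Degree of the twist: 2 for MNT4/6, mnt6 and sw6; 6 for BLS12.
    const TWIST_DEGREE: usize;
    fn name() -> &'static str;
}

struct Twist2Ate; // G2 as a subgroup of a quadratic twist
struct Twist6Ate; // G2 as a subgroup of a sextic twist

impl TwistModel for Twist2Ate {
    const TWIST_DEGREE: usize = 2;
    fn name() -> &'static str { "twist2Ate" }
}

impl TwistModel for Twist6Ate {
    const TWIST_DEGREE: usize = 6;
    fn name() -> &'static str { "twist6Ate" }
}

// Stand-in for the shared entry point that would live in
// `algebra/src/pairings/`; a real driver would run the Miller loop and
// final exponentiation generically over `T` instead of formatting a string.
fn describe_pairing<T: TwistModel>() -> String {
    format!("{} (twist degree {})", T::name(), T::TWIST_DEGREE)
}
```

With this shape, each curve family only supplies its parameters, and the five near-duplicate evaluation routines collapse into one driver per twist degree.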
|
non_process
|
restructuring pairings by now the pairing engines for different types of pairing friendly curves are implemented independently resulting in five pieces of very similar code for the ate pairing evaluation algebra src curves models mod rs zexe s generic implementation for curves in affine coordinates algebra src curves mod rs zexe s specific implementation for the algebra src curves mod rs another implementation for the cocks pinch curve of embedding degree zexe s outer curve with respect to the and our recent implementations for generic curves algebra src curves models mod rs and curves algebra src curves models mod rs i propose a different level of abstraction introducing and as pairing types according to the two representations of as found in the above implementations either as a subgroup of a quadratic twist as for or sextic twist of the curve both and need to allow the base field for the twist e to be a general extension field of the same characteristic as the one for e but the operations for the evaluation of the ate paring are generic i e independent of the degree of the base field for e such abstraction makes it possible to reuse these two types of pairing types and to aggregate pairing evaluation code in an own folder e g algebra src pairings separated from algebra src curves which contain also non pairing friendly curves such as the jubjub in my opinion such a structure is more clear than the present moreover it makes it is easier to find all the pairing code when extending the lib by further pairing types which use cubic twists e g
| 0
|
97,215
| 3,987,145,701
|
IssuesEvent
|
2016-05-09 00:46:40
|
nvs/gem
|
https://api.github.com/repos/nvs/gem
|
opened
|
Multiboard disappears if game mode selection is too slow
|
Priority: Soon Status: Not Started Type: Bug
|
Annoying to say the least. This is fixable, for sure.
Can also consider, at a later point, increasing the amount of information the multiboard can display. Perhaps some sort of toggle.
|
1.0
|
Multiboard disappears if game mode selection is too slow - Annoying to say the least. This is fixable, for sure.
Can also consider, at a later point, increasing the amount of information the multiboard can display. Perhaps some sort of toggle.
|
non_process
|
multiboard disappears if game mode selection is too slow annoying to say the least this is fixable for sure can also consider at a later point increasing the amount of information the multiboard can display perhaps some sort of toggle
| 0
|
21,576
| 29,933,261,682
|
IssuesEvent
|
2023-06-22 10:58:52
|
qgis/QGIS-Documentation
|
https://api.github.com/repos/qgis/QGIS-Documentation
|
closed
|
Complement information about the use of pipes in GDAL creation options
|
Processing Alg
|
Are they the only ones or would that concern _also_ [these ones](https://github.com/qgis/QGIS-Documentation/pull/6972/files)? And maybe there is room for a generic information at https://docs.qgis.org/testing/en/docs/user_manual/processing/modeler.html#definition-of-inputs?
_Originally posted by @DelazJ in https://github.com/qgis/QGIS-Documentation/pull/8095#issuecomment-1465512415_
|
1.0
|
Complement information about the use of pipes in GDAL creation options - Are they the only ones or would that concern _also_ [these ones](https://github.com/qgis/QGIS-Documentation/pull/6972/files)? And maybe there is room for a generic information at https://docs.qgis.org/testing/en/docs/user_manual/processing/modeler.html#definition-of-inputs?
_Originally posted by @DelazJ in https://github.com/qgis/QGIS-Documentation/pull/8095#issuecomment-1465512415_
|
process
|
complement information about the use of pipes in gdal creation options are they the only ones or would that concern also and maybe there is room for a generic information at originalmente postado por delazj em
| 1
|
13,029
| 15,381,071,046
|
IssuesEvent
|
2021-03-02 22:04:00
|
googleapis/google-cloud-go
|
https://api.github.com/repos/googleapis/google-cloud-go
|
opened
|
internal/kokoro: collect packages to apidiff with a regexp, not a list
|
type: process
|
`check_incompat_changes.sh` only diffs those packages explicitly listed [here](https://github.com/googleapis/google-cloud-go/blob/master/internal/kokoro/check_incompat_changes.sh#L37-L38). This is not scalable and it is currently out of date. We should use a regexp of some sort to list out all apiv1 packages for diffing.
|
1.0
|
internal/kokoro: collect packages to apidiff with a regexp, not a list - `check_incompat_changes.sh` only diffs those packages explicitly listed [here](https://github.com/googleapis/google-cloud-go/blob/master/internal/kokoro/check_incompat_changes.sh#L37-L38). This is not scalable and it is currently out of date. We should use a regexp of some sort to list out all apiv1 packages for diffing.
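A minimal sketch of the regexp-based collection; the `apivN` directory-naming convention and the exact pattern are assumptions about the repo layout, not the script's current contents.

```shell
# Sketch: derive the apidiff targets from directory names matching the
# apivN convention used by generated clients, instead of a hard-coded list.
list_apiv_packages() {
  # $1: repository root to scan
  find "$1" -type d | grep -E '/apiv[0-9]+[a-z0-9]*$' | sort
}
```

The sorted output could then replace the explicit package list in `check_incompat_changes.sh`, so newly generated `apivN` packages are diffed automatically.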
|
process
|
internal kokoro collect packages to apidiff with a regexp not a list check incompat changes sh only diffs those packages explicitly listed this is not scalable and it is currently out of date we should use a regexp of some sort to list out all packages for diffing
| 1
|
97,128
| 28,105,106,796
|
IssuesEvent
|
2023-03-30 23:26:36
|
webstudio-is/webstudio-builder
|
https://api.github.com/repos/webstudio-is/webstudio-builder
|
opened
|
Button component is a different size than link block component, even with the same styling applied
|
type:bug area:builder prio:1
|
https://user-images.githubusercontent.com/122043657/228986570-e9c673a3-d3dd-42ae-9190-97984a1dc127.mp4
I've made an example in the debugging page in the builder
|
1.0
|
Button component is a different size than link block component, even with the same styling applied -
https://user-images.githubusercontent.com/122043657/228986570-e9c673a3-d3dd-42ae-9190-97984a1dc127.mp4
I've made an example in the debugging page in the builder
|
non_process
|
button component is a different size than link block component even with the same styling applied i ve made an example in the debugging page in the builder
| 0
|
8,821
| 10,773,039,817
|
IssuesEvent
|
2019-11-02 18:13:45
|
ppy/osu-framework
|
https://api.github.com/repos/ppy/osu-framework
|
closed
|
Mouse issue (Linux on ChromeOS)
|
compatibility input linux
|
After opening Lazer, once the intro is done, the mouse won't do anything. I clicked on the osu! logo to get started with the game, but the mouse was unresponsive: it can still move, but when I click, the cursor doesn't change color and nothing actually gets clicked.
(runtime.log file from the terminal): https://docs.google.com/document/d/1HWRxGtwR3rL-PVrtV0_VyRuzLuAyTlcoQhs3gs0InXo/edit?usp=sharing
Version: 2019.1029.0
Input device: USB mouse
|
True
|
Mouse issue (Linux on ChromeOS) - After opening Lazer, once the intro is done, the mouse won't do anything. I clicked on the osu! logo to get started with the game, but the mouse was unresponsive: it can still move, but when I click, the cursor doesn't change color and nothing actually gets clicked.
(runtime.log file from the terminal): https://docs.google.com/document/d/1HWRxGtwR3rL-PVrtV0_VyRuzLuAyTlcoQhs3gs0InXo/edit?usp=sharing
Version: 2019.1029.0
Input device: USB mouse
|
non_process
|
mouse issue linux on chromeos after opening lazer and the intro is done the mouse won t do anything i clicked on the osu logo to get started with the game but the mouse was unresponsive meaning the mouse can still move but when i click the cursor doesn t change color it doesn t actually click on anything at all runtime log file from the terminal version input device usb mouse
| 0
|
6,097
| 8,958,124,873
|
IssuesEvent
|
2019-01-27 11:39:35
|
qgis/QGIS-Documentation
|
https://api.github.com/repos/qgis/QGIS-Documentation
|
closed
|
[FEATURE] Export processing models as PDF/SVG
|
Automatic new feature Processing
|
Original commit: https://github.com/qgis/QGIS/commit/f54476cddaf70096c585c984c5963fdaf5530931 by nirvn
Unfortunately this naughty coder did not write a description... :-(
|
1.0
|
[FEATURE] Export processing models as PDF/SVG - Original commit: https://github.com/qgis/QGIS/commit/f54476cddaf70096c585c984c5963fdaf5530931 by nirvn
Unfortunately this naughty coder did not write a description... :-(
|
process
|
export processing models as pdf svg original commit by nirvn unfortunately this naughty coder did not write a description
| 1
|
597
| 3,072,026,223
|
IssuesEvent
|
2015-08-19 15:03:59
|
processing/processing
|
https://api.github.com/repos/processing/processing
|
closed
|
Parsing generic fails when package is specified (also a problem with Map.Entry)
|
imported preprocessor
|
_Original author: cpau...@gmail.com (October 26, 2011 16:27:52)_
If you attempt to declare something that takes a generic parameter while using the full package name, Processing throws a syntax error. For example:
java.util.ArrayList<Integer> stuff;
Throws the error "Syntax error on token "<", TypeArgumentList1 expected after this token".
I am using Processing 1.5.1 on Mac OS 10.7.2.
_Original issue: http://code.google.com/p/processing/issues/detail?id=880_
|
1.0
|
Parsing generic fails when package is specified (also a problem with Map.Entry) - _Original author: cpau...@gmail.com (October 26, 2011 16:27:52)_
If you attempt to declare something that takes a generic parameter while using the full package name, Processing throws a syntax error. For example:
java.util.ArrayList<Integer> stuff;
Throws the error "Syntax error on token "<", TypeArgumentList1 expected after this token".
I am using Processing 1.5.1 on Mac OS 10.7.2.
_Original issue: http://code.google.com/p/processing/issues/detail?id=880_
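A common workaround until the preprocessor handles fully qualified generics is to import the class and declare the generic with its simple name. The sketch below illustrates this; the class and method names are hypothetical, added only for the example.

```java
// Workaround sketch: import the class, then use the simple name in the
// generic declaration. The fully qualified form is what trips the
// Processing preprocessor.
import java.util.ArrayList;

class FqnWorkaround {
    // java.util.ArrayList<Integer> stuff;  // this form triggers the PDE error
    static ArrayList<Integer> stuff = new ArrayList<Integer>();

    static int addAndCount(int value) {
        stuff.add(value);
        return stuff.size();
    }
}
```

The imported simple-name form compiles identically to the fully qualified one, so nothing is lost by the workaround.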
|
process
|
parsing generic fails when package is specified also a problem with map entry original author cpau gmail com october if you attempt to declare something that takes a generic parameter while using the full package name processing throws a syntax error for example java util arraylist lt integer gt stuff throws the error quot syntax error on token quot lt quot expected after this token quot i am using processing on mac os original issue
| 1
|
9,810
| 12,822,978,548
|
IssuesEvent
|
2020-07-06 10:48:19
|
solid/process
|
https://api.github.com/repos/solid/process
|
closed
|
Proposal to Clean up Repos to Avoid Wiki Rot
|
process proposal
|
On https://github.com/solid/process/issues/180 there was a conversation about a repo naming system and on https://github.com/solid/process/pull/172 there is a repository overview.
There are 117 repositories in github.com/solid and it is not easy for newcomers nor for people working on the repositories for some time to navigate between these repositories in a way that it is crystal clear what is the aim of each of the repositories.
The Process repository was started as an attempt to collectively agree on how to work on specific aims within github.com/solid including: administration of solid properties, standardisation work, and creating and maintaining the website solidproject.org. There are more activities going on in github.com/solid than the three just mentioned.
This is a proposal about how to gain clarity around the aims of the work happening in github.com/solid as well as clarity around who is responsible for what.
There are some repositories that used to fill the functions of the repositories already described here above but are no longer maintained by defined people. The key information needs to be combined with the repositories above and archived to avoid thinning of information and wiki rot.
| Repository | What needs to happen before archiving |Where this is now taking place instead |
| ------------- | ------------- | ------------- |
| https://github.com/solid/Explaining-the-Vision-Panel | no action needed | Solidproject.org |
| https://github.com/solid/webid-oidc-spec | Move issues and pull requests to github.com/solid/specification | github.com/solid/specification |
| https://github.com/solid/solid-spec | Move issues and pull requests to github.com/solid/specification | github.com/solid/specification |
| https://github.com/solid/web-access-control-spec | Move issues and pull requests to github.com/solid/specification | github.com/solid/specification |
| https://github.com/solid/solid | Move issues and pull requests to github.com/solid/specification | github.com/solid/specification |
| https://github.com/solid/solid.mit.edu | no action | solid.mit.edu no longer attached to this repo |
| https://github.com/solid/websci-2019 | no action | solidproject.org/press|
| https://github.com/solid/Roadmap |no action | solidproject.org/roadmap (ticket to be created) |
| https://github.com/solid/solid-apps | no action | solidproject.org/use-solid |
| https://github.com/solid/pods | no action | solidproject.org/use-solid |
| https://github.com/solid/solid-idp-list | no action | solidproject.org/use-solid |
| https://github.com/solid/solid-idps | no action | solidproject.org/use-solid |
| https://github.com/solid/information | no action | solidproject.org|
| https://github.com/solid/context | no action | solidproject.org.for-developers |
| https://github.com/solid/vocab | no action | solidproject.org.for-developers |
| https://github.com/solid/solid-namespace | no action | solidproject.org.for-developers |
| https://github.com/solid/dweb-summit-2018 | no action | solidproject.org.for-developers |
| https://github.com/solid/talks | no action | solidproject.org.for-developers |
| https://github.com/solid/intro-to-solid-slides | no action | solidproject.org.for-developers |
|https://github.com/solid/profile-viewer-tutorial | no action | solidproject.org.for-developers |
|https://github.com/solid/solid-tutorial-angular | no action | solidproject.org.for-developers |
|https://github.com/solid/solid-tutorial-rdflib.js | no action | solidproject.org.for-developers |
|https://github.com/solid/understanding-linked-data | no action | solidproject.org.for-developers |
| https://github.com/solid/solid-tutorial-pastebin | no action | solidproject.org.for-developers |
| https://github.com/solid/solid-tutorial-intro | no action | solidproject.org.for-developers |
## Solid Research
In the remaining ~50% of the repositories of github.com/solid there is a range of experiments and research on Solid. The governance of this experimental research is not described in the process repository; the work largely started during the Solid MIT research project and has been picked up by the University of Ghent in more recent years.
Some of the research works on implementing the Solid standard to ensure the proposals are feasible. There is not a defined intention to provide this software as a service to end-users with a defined service provider, although some users do so organically. In particular node solid server is used by many developers as a reference Pod when building Solid applications.
Here is a list of repositories that could be tagged as 'research':
There are various implementations of the Solid specification.
| Implementation of Solid Specification | Associated Repositories |
| ------------- | ------------- |
| [Implementation of Solid Server (Pod)](https://github.com/orgs/solid/projects/2) | [node-solid-server](https://github.com/solid/node-solid-server), [node-solid-ws](https://github.com/solid/node-solid-ws) | [Jackson Morgan](https://github.com/jaxoncreed), Michiel de Jong |
| [Data Browser (app)](https://github.com/orgs/solid/projects/4) | [solid-ui](https://github.com/solid/solid-ui), [mashlib](https://github.com/solid/mashlib), [solid-panes](https://github.com/solid/solid-panes), [Chat Pane](https://github.com/solid/chat-pane), [Solid Pane Source](https://github.com/solid/solid-pane-source), [Source Pane](https://github.com/solid/source-pane), [Issue Pane](https://github.com/solid/issue-pane), [Contacts Pane](https://github.com/solid/contacts-pane), [Folder Pane](https://github.com/solid/folder-pane), [Meeting Pane](https://github.com/solid/meeting-pane), [Pane Registry](https://github.com/solid/pane-registry), [userguide](https://github.com/solid/userguide) | Arne Hassel, Tim Berners-Lee, Vincent Tunru, Kevin Howard, Daphne |
| example applications | [profile-viewer-react](https://github.com/solid/profile-viewer-react), [solid-connections-ui](https://github.com/solid/solid-connections-ui), [solid-profile-ui](https://github.com/solid/solid-profile-ui), [solid-dashboard-ui](https://github.com/solid/solid-dashboard-ui), [solid-signup-ui](https://github.com/solid/solid-signup-ui), [solid-signin-ui](https://github.com/solid/solid-signin-ui), [solid-sign-up](https://github.com/solid/solid-sign-up), [solid zagel](https://github.com/solid/solid-zagel) |
| a way to take data from gitter chat and move it into Solid | [gitter-solid](https://github.com/solid/gitter-solid) |
There are various Solid-related libraries mostly being led by [Ruben Verborgh](https://github.com/RubenVerborgh).
| Description | Associated Repositories |
| ------------- | ------------- |
| An archive of built versions of various Solid-related libraries | [releases](https://github.com/solid/releases) |
| authentication tools | [solid-auth-client](https://github.com/solid/solid-auth-client), [solid-auth-oidc](https://github.com/solid/solid-auth-oidc), [solid-auth-tls](https://github.com/solid/solid-auth-tls), [oidc-auth-manager](https://github.com/solid/oidc-auth-manager), [solid-cli](https://github.com/solid/solid-cli), [solid-client](https://github.com/solid/solid-client), [solid-multi-rp-client](https://github.com/solid/solid-multi-rp-client), [oidc-web](https://github.com/solid/oidc-web), [oidc-op](https://github.com/solid/oidc-op), [oidc-rp](https://github.com/solid/oidc-rp)[oidc-rs](https://github.com/solid/oidc-rs), [keychain](https://github.com/solid/keychain), [jose](https://github.com/solid/jose), [wac-allow](https://github.com/solid/wac-allow) |
| authorisation tools | [acl-check](https://github.com/solid/acl-check), [solid-permissions](https://github.com/solid/solid-permissions) |
| client-side libraries | [react-components](https://github.com/solid/react-components), [form-playground](https://github.com/solid/form-playground) |
| querying tools | [query-ldflex](https://github.com/solid/query-ldflex), [ldflex-playground](https://github.com/solid/ldflex-playground), [solid-tpf](https://github.com/solid/solid-tpf)|
| a description of one way to implement the specification | [solid-architecture](https://github.com/solid/solid-architecture) |
Following is a list of other Solid research:
- [solid-email](https://github.com/solid/solid-email)
- [ldp-glob](https://github.com/solid/ldp-glob)
- [test-idp](https://github.com/solid/test-idp)
- [solid-takeout-import](https://github.com/solid/solid-takeout-import)
- [solid-inbox](https://github.com/solid/solid-inbox)
- [kvplus-files](https://github.com/solid/kvplus-files)
- [mavo-solid](https://github.com/solid/mavo-solid)
- [solid-web-client](https://github.com/solid/solid-web-client)
- [solid-notifications](https://github.com/solid/solid-notifications)
- [resource-access](https://github.com/solid/resource-access)
- [solid platform](https://github.com/solid/solid-platform)
- [solid signup](https://github.com/solid/solid-signup)
|
1.0
|
Proposal to Clean up Repos to Avoid Wiki Rot - On https://github.com/solid/process/issues/180 there was a conversation about a repo naming system and on https://github.com/solid/process/pull/172 there is a repository overview.
There are 117 repositories in github.com/solid and it is not easy for newcomers nor for people working on the repositories for some time to navigate between these repositories in a way that it is crystal clear what is the aim of each of the repositories.
The Process repository was started as an attempt to collectively agree on how to work on specific aims within github.com/solid including: administration of solid properties, standardisation work, and creating and maintaining the website solidproject.org. There are more activities going on in github.com/solid than the three just mentioned.
This is a proposal about how to gain clarity around the aims of the work happening in github.com/solid as well as clarity around who is responsible for what.
There are some repositories that used to fill the functions of the repositories already described here above but are no longer maintained by defined people. The key information needs to be combined with the repositories above and archived to avoid thinning of information and wiki rot.
| Repository | What needs to happen before archiving |Where this is now taking place instead |
| ------------- | ------------- | ------------- |
| https://github.com/solid/Explaining-the-Vision-Panel | no action needed | Solidproject.org |
| https://github.com/solid/webid-oidc-spec | Move issues and pull requests to github.com/solid/specification | github.com/solid/specification |
| https://github.com/solid/solid-spec | Move issues and pull requests to github.com/solid/specification | github.com/solid/specification |
| https://github.com/solid/web-access-control-spec | Move issues and pull requests to github.com/solid/specification | github.com/solid/specification |
| https://github.com/solid/solid | Move issues and pull requests to github.com/solid/specification | github.com/solid/specification |
| https://github.com/solid/solid.mit.edu | no action | solid.mit.edu no longer attached to this repo |
| https://github.com/solid/websci-2019 | no action | solidproject.org/press|
| https://github.com/solid/Roadmap |no action | solidproject.org/roadmap (ticket to be created) |
| https://github.com/solid/solid-apps | no action | solidproject.org/use-solid |
| https://github.com/solid/pods | no action | solidproject.org/use-solid |
| https://github.com/solid/solid-idp-list | no action | solidproject.org/use-solid |
| https://github.com/solid/solid-idps | no action | solidproject.org/use-solid |
| https://github.com/solid/information | no action | solidproject.org|
| https://github.com/solid/context | no action | solidproject.org.for-developers |
| https://github.com/solid/vocab | no action | solidproject.org.for-developers |
| https://github.com/solid/solid-namespace | no action | solidproject.org.for-developers |
| https://github.com/solid/dweb-summit-2018 | no action | solidproject.org.for-developers |
| https://github.com/solid/talks | no action | solidproject.org.for-developers |
| https://github.com/solid/intro-to-solid-slides | no action | solidproject.org.for-developers |
|https://github.com/solid/profile-viewer-tutorial | no action | solidproject.org.for-developers |
|https://github.com/solid/solid-tutorial-angular | no action | solidproject.org.for-developers |
|https://github.com/solid/solid-tutorial-rdflib.js | no action | solidproject.org.for-developers |
|https://github.com/solid/understanding-linked-data | no action | solidproject.org.for-developers |
| https://github.com/solid/solid-tutorial-pastebin | no action | solidproject.org.for-developers |
| https://github.com/solid/solid-tutorial-intro | no action | solidproject.org.for-developers |
## Solid Research
In the remaining ~50% of the repositories of github.com/solid there is a range of experiments and research on Solid. The governance of this experimental research is not described in the process repository; the work largely started during the Solid MIT research project and has been picked up by the University of Ghent in more recent years.
Some of the research works on implementing the Solid standard to ensure the proposals are feasible. There is not a defined intention to provide this software as a service to end-users with a defined service provider, although some users do so organically. In particular node solid server is used by many developers as a reference Pod when building Solid applications.
Here is a list of repositories that could be tagged as 'research':
There are various implementations of the Solid specification.
| Implementation of Solid Specification | Associated Repositories |
| ------------- | ------------- |
| [Implementation of Solid Server (Pod)](https://github.com/orgs/solid/projects/2) | [node-solid-server](https://github.com/solid/node-solid-server), [node-solid-ws](https://github.com/solid/node-solid-ws) | [Jackson Morgan](https://github.com/jaxoncreed), Michiel de Jong |
| [Data Browser (app)](https://github.com/orgs/solid/projects/4) | [solid-ui](https://github.com/solid/solid-ui), [mashlib](https://github.com/solid/mashlib), [solid-panes](https://github.com/solid/solid-panes), [Chat Pane](https://github.com/solid/chat-pane), [Solid Pane Source](https://github.com/solid/solid-pane-source), [Source Pane](https://github.com/solid/source-pane), [Issue Pane](https://github.com/solid/issue-pane), [Contacts Pane](https://github.com/solid/contacts-pane), [Folder Pane](https://github.com/solid/folder-pane), [Meeting Pane](https://github.com/solid/meeting-pane), [Pane Registry](https://github.com/solid/pane-registry), [userguide](https://github.com/solid/userguide) | Arne Hassel, Tim Berners-Lee, Vincent Tunru, Kevin Howard, Daphne |
| example applications | [profile-viewer-react](https://github.com/solid/profile-viewer-react), [solid-connections-ui](https://github.com/solid/solid-connections-ui), [solid-profile-ui](https://github.com/solid/solid-profile-ui), [solid-dashboard-ui](https://github.com/solid/solid-dashboard-ui), [solid-signup-ui](https://github.com/solid/solid-signup-ui), [solid-signin-ui](https://github.com/solid/solid-signin-ui), [solid-sign-up](https://github.com/solid/solid-sign-up), [solid zagel](https://github.com/solid/solid-zagel) |
| a way to take data from gitter chat and move it into Solid | [gitter-solid](https://github.com/solid/gitter-solid) |
There are various Solid-related libraries mostly being led by [Ruben Verborgh](https://github.com/RubenVerborgh).
| Description | Associated Repositories |
| ------------- | ------------- |
| An archive of built versions of various Solid-related libraries | [releases](https://github.com/solid/releases) |
| authentication tools | [solid-auth-client](https://github.com/solid/solid-auth-client), [solid-auth-oidc](https://github.com/solid/solid-auth-oidc), [solid-auth-tls](https://github.com/solid/solid-auth-tls), [oidc-auth-manager](https://github.com/solid/oidc-auth-manager), [solid-cli](https://github.com/solid/solid-cli), [solid-client](https://github.com/solid/solid-client), [solid-multi-rp-client](https://github.com/solid/solid-multi-rp-client), [oidc-web](https://github.com/solid/oidc-web), [oidc-op](https://github.com/solid/oidc-op), [oidc-rp](https://github.com/solid/oidc-rp)[oidc-rs](https://github.com/solid/oidc-rs), [keychain](https://github.com/solid/keychain), [jose](https://github.com/solid/jose), [wac-allow](https://github.com/solid/wac-allow) |
| authorisation tools | [acl-check](https://github.com/solid/acl-check), [solid-permissions](https://github.com/solid/solid-permissions) |
| client-side libraries | [react-components](https://github.com/solid/react-components), [form-playground](https://github.com/solid/form-playground) |
| querying tools | [query-ldflex](https://github.com/solid/query-ldflex), [ldflex-playground](https://github.com/solid/ldflex-playground), [solid-tpf](https://github.com/solid/solid-tpf)|
| a description of one way to implement the specification | [solid-architecture](https://github.com/solid/solid-architecture) |
Following is a list of other Solid research:
- [solid-email](https://github.com/solid/solid-email)
- [ldp-glob](https://github.com/solid/ldp-glob)
- [test-idp](https://github.com/solid/test-idp)
- [solid-takeout-import](https://github.com/solid/solid-takeout-import)
- [solid-inbox](https://github.com/solid/solid-inbox)
- [kvplus-files](https://github.com/solid/kvplus-files)
- [mavo-solid](https://github.com/solid/mavo-solid)
- [solid-web-client](https://github.com/solid/solid-web-client)
- [solid-notifications](https://github.com/solid/solid-notifications)
- [resource-access](https://github.com/solid/resource-access)
- [solid platform](https://github.com/solid/solid-platform)
- [solid signup](https://github.com/solid/solid-signup)
|
process
|
proposal to clean up repos to avoid wiki rot on there was a conversation about a repo naming system and on there is a repository overview there are repositories in github com solid and it is not easy for newcomers nor for people working on the repositories for some time to navigate between these repositories in a way that it is crystal clear what is the aim of each of the repositories the process repository was started as an attempt to collectively agree on how to work on specific aims within github com solid including administration of solid properties standardisation work and creating and maintaining the website solidproject org there are more activities going on in github com solid that the three just mentioned this is a proposal about how to gain clarity around the aims of the work happening in github com solid as well as clarity around who is responsible for what there are some repositories that used to fill the functions of the repositories already described here above but are no longer maintained by defined people the key information needs to be combined with the repositories above and archived to avoid thinning of information and wiki rot repository what needs to happen before archiving where this is now taking place instead no action needed solidproject org move issues and pull requests to github com solid specification github com solid specification move issues and pull requests to github com solid specification github com solid specification move issues and pull requests to github com solid specification github com solid specification move issues and pull requests to github com solid specification github com solid specification no action solid mit edu no longer attached to this repo no action solidproject org press no action solidproject org roadmap ticket to be created no action solidproject org use solid no action solidproject org use solid no action solidproject org use solid no action solidproject org use solid no action solidproject org no action 
solidproject org for developers no action solidproject org for developers no action solidproject org for developers no action solidproject org for developers no action solidproject org for developers no action solidproject org for developers no action solidproject org for developers no action solidproject org for developers no action solidproject org for developers no action solidproject org for developers no action solidproject org for developers no action solidproject org for developers solid research in the remaining of the repositories of github com solid there are a range of experiments and research on solid the aim of governance of the experimental research is not described in the process repository and largely started during the solid mit research project and has been picked up by the university of ghent in more recent years some of the research works on implementing the solid standard to ensure the proposals are feasible there is not a defined intention to provide this software as a service to end users with a defined service provider although some users do so organically in particular node solid server is used by many developers as a reference pod when building solid applications here are a list of repositories that could be tagged as research there are various implementations of the solid specification implementation of solid specification associated repositories michiel de jong arne hassel tim berners lee vincent tunru kevin howard daphne example applications a way to take data from gitter chat and move it into solid there are various solid related libraries mostly being led by description associated repositories an archive of built versions of various solid related libraries authentication tools authorisation tools client side libraries querying tools a description of one way to implement the specification folllowing is a list of other solid research
| 1
|
646,687
| 21,056,510,569
|
IssuesEvent
|
2022-04-01 04:15:32
|
AY2122S2-CS2103T-W14-3/tp
|
https://api.github.com/repos/AY2122S2-CS2103T-W14-3/tp
|
closed
|
As a user keeping track of many events, I can delete all past events
|
priority: medium
|
so that I can avoid having to delete them one by one
|
1.0
|
As a user keeping track of many events, I can delete all past events - so that I can avoid having to delete them one by one
|
non_process
|
as a user keeping track of many events i can delete all past events so that i can avoid having to delete them one by one
| 0
|
18,030
| 24,037,135,195
|
IssuesEvent
|
2022-09-15 20:18:54
|
magland/spikesortingview
|
https://api.github.com/repos/magland/spikesortingview
|
closed
|
annotations: time intervals
|
in process
|
* First need a mechanism to select an interval of interest. I think shift+drag is a natural choice.
* Add that to the recording selection state.
* Make a annotation type 'time-interval'
* Display the time-interval annotations in TimeScrollView
* Click annotation to go to that interval - perhaps selected
* Figure out zooming - when clicking a time-interval annotation perhaps we want to zoom in to it.
|
1.0
|
annotations: time intervals - * First need a mechanism to select an interval of interest. I think shift+drag is a natural choice.
* Add that to the recording selection state.
* Make a annotation type 'time-interval'
* Display the time-interval annotations in TimeScrollView
* Click annotation to go to that interval - perhaps selected
* Figure out zooming - when clicking a time-interval annotation perhaps we want to zoom in to it.
|
process
|
annotations time intervals first need a mechanism to select an interval of interest i think shift drag is a natural choice add that to the recording selection state make a annotation type time interval display the time interval annotations in timescrollview click annotation to go to that interval perhaps selected figure out zooming when clicking a time interval annotation perhaps we want to zoom in to it
| 1
|
7,797
| 10,957,819,958
|
IssuesEvent
|
2019-11-27 08:00:02
|
didi/mpx
|
https://api.github.com/repos/didi/mpx
|
closed
|
使用自定义tabbar,使用this.getTabBar方法时,ts报错
|
processing
|
**问题描述**
使用自定义tabbar,使用this.getTabBar方法时,ts报错:
Property 'getTabBar' does not exist on type 'ComponentIns
**复现步骤**
使用ts,createPage() onShow中调用 this.getTabBar
引用微信官方文档:
如需实现 tab 选中态,要在当前页面下,通过 getTabBar 接口获取组件实例,并调用 setData 更新选中态。
报错提示:
Property 'getTabBar' does not exist on type 'ComponentIns<{ showGradeModal: boolean; }, {}, { courseList(): CardBlock; locationSetted(): boolean; }, { asyncData(): Promise<[void, void]>; switchGradeModal(): void; getSift: () => Promise<void>; getCourses: () => Promise<...>; getADS: () => Promise<...>; }, []>'
目前我添加了// @ts-ignore 忽略了该报错
|
1.0
|
使用自定义tabbar,使用this.getTabBar方法时,ts报错 - **问题描述**
使用自定义tabbar,使用this.getTabBar方法时,ts报错:
Property 'getTabBar' does not exist on type 'ComponentIns
**复现步骤**
使用ts,createPage() onShow中调用 this.getTabBar
引用微信官方文档:
如需实现 tab 选中态,要在当前页面下,通过 getTabBar 接口获取组件实例,并调用 setData 更新选中态。
报错提示:
Property 'getTabBar' does not exist on type 'ComponentIns<{ showGradeModal: boolean; }, {}, { courseList(): CardBlock; locationSetted(): boolean; }, { asyncData(): Promise<[void, void]>; switchGradeModal(): void; getSift: () => Promise<void>; getCourses: () => Promise<...>; getADS: () => Promise<...>; }, []>'
目前我添加了// @ts-ignore 忽略了该报错
|
process
|
使用自定义tabbar,使用this gettabbar方法时,ts报错 问题描述 使用自定义tabbar,使用this gettabbar方法时,ts报错: property gettabbar does not exist on type componentins 复现步骤 使用ts,createpage onshow中调用 this gettabbar 引用微信官方文档: 如需实现 tab 选中态,要在当前页面下,通过 gettabbar 接口获取组件实例,并调用 setdata 更新选中态。 报错提示: property gettabbar does not exist on type componentins switchgrademodal void getsift promise getcourses promise getads promise 目前我添加了 ts ignore 忽略了该报错
| 1
|
581,499
| 17,294,848,901
|
IssuesEvent
|
2021-07-25 14:12:45
|
lokka30/LevelledMobs
|
https://api.github.com/repos/lokka30/LevelledMobs
|
opened
|
Other: LM 3.1 To-do List
|
need more info priority: high thoughts wanted
|
Edit this post if you want to edit the to-do list. Add comments if you wish.
* @lokka30 - post-release edit spigotmc description, move the 'ceasing support for ...' announcements, add them into 'very important info' to state that J11 and 1.16.x minimum.
* @lokka30 or anyone else optionally - at-release edit 'compiling levelledmobs' wiki page, remove the entire section containing the environment variables setup as we no longer use MM. Penal is a saviour for that, among many other things
* @lokka30 or anyone else optionally - pre-release Compile a changelog of all the changes.
* @lokka30 or anyone else optionally - at-release edit installation instructions and compatibility page to make it clear that Java11 and 1.16.x are minimum versions supported for java and MC
|
1.0
|
Other: LM 3.1 To-do List - Edit this post if you want to edit the to-do list. Add comments if you wish.
* @lokka30 - post-release edit spigotmc description, move the 'ceasing support for ...' announcements, add them into 'very important info' to state that J11 and 1.16.x minimum.
* @lokka30 or anyone else optionally - at-release edit 'compiling levelledmobs' wiki page, remove the entire section containing the environment variables setup as we no longer use MM. Penal is a saviour for that, among many other things
* @lokka30 or anyone else optionally - pre-release Compile a changelog of all the changes.
* @lokka30 or anyone else optionally - at-release edit installation instructions and compatibility page to make it clear that Java11 and 1.16.x are minimum versions supported for java and MC
|
non_process
|
other lm to do list edit this post if you want to edit the to do list add comments if you wish post release edit spigotmc description move the ceasing support for announcements add them into very important info to state that and x minimum or anyone else optionally at release edit compiling levelledmobs wiki page remove the entire section containing the environment variables setup as we no longer use mm penal is a saviour for that among many other things or anyone else optionally pre release compile a changelog of all the changes or anyone else optionally at release edit installation instructions and compatibility page to make it clear that and x are minimum versions supported for java and mc
| 0
|
5,623
| 8,481,780,216
|
IssuesEvent
|
2018-10-25 16:37:36
|
aspnet/IISIntegration
|
https://api.github.com/repos/aspnet/IISIntegration
|
closed
|
Test failure: AppOfflineDroppedWhileSiteStarting_SiteShutsDown_InProcess under App Verifier
|
in-process test-failure
|
I think the failure is caused by a race between stopping stdout redirect and debugutil logging code.
|
1.0
|
Test failure: AppOfflineDroppedWhileSiteStarting_SiteShutsDown_InProcess under App Verifier - I think the failure is caused by a race between stopping stdout redirect and debugutil logging code.
|
process
|
test failure appofflinedroppedwhilesitestarting siteshutsdown inprocess under app verifier i think the failure is caused by a race between stopping stdout redirect and debugutil logging code
| 1
|
95,722
| 12,035,314,707
|
IssuesEvent
|
2020-04-13 17:39:44
|
EraAmate/invita-project
|
https://api.github.com/repos/EraAmate/invita-project
|
closed
|
Add components to storybook
|
design storybook
|
- [x] Create components for basis layout elements in storybook
- [x] Try to use knobs (for click, check)
- [x] Deploy storybook
|
1.0
|
Add components to storybook - - [x] Create components for basis layout elements in storybook
- [x] Try to use knobs (for click, check)
- [x] Deploy storybook
|
non_process
|
add components to storybook create components for basis layout elements in storybook try to use knobs for click check deploy storybook
| 0
|
240,081
| 7,800,380,766
|
IssuesEvent
|
2018-06-09 08:43:06
|
tine20/Tine-2.0-Open-Source-Groupware-and-CRM
|
https://api.github.com/repos/tine20/Tine-2.0-Open-Source-Groupware-and-CRM
|
closed
|
0009004:
sieve + tls problem: Could not authenticate with user xy (Plaintext authentication disabled.)
|
Bug Felamimail Mantis high priority
|
**Reported by martin on 6 Oct 2013 23:39**
**Version:** Kristina (2013.03.8)
see https://www.tine20.org/forum/viewtopic.php?uid=244&f=12&t=11650&start=0
|
1.0
|
0009004:
sieve + tls problem: Could not authenticate with user xy (Plaintext authentication disabled.) - **Reported by martin on 6 Oct 2013 23:39**
**Version:** Kristina (2013.03.8)
see https://www.tine20.org/forum/viewtopic.php?uid=244&f=12&t=11650&start=0
|
non_process
|
sieve tls problem could not authenticate with user xy plaintext authentication disabled reported by martin on oct version kristina see
| 0
|
63,459
| 8,679,940,650
|
IssuesEvent
|
2018-12-01 04:12:12
|
doctrine/doctrine2
|
https://api.github.com/repos/doctrine/doctrine2
|
opened
|
Drop legacy PEAR stuff
|
Documentation Improvement
|
PEAR distribution is deprecated since ORM 2.4 and our pear channel on doctrine-project.org doesn't work anymore anyway. It should be dropped:
- any occurences in docs,
- support in `bin/doctrine.bat`,
- `bin/doctrine-pear.php` altogether.
|
1.0
|
Drop legacy PEAR stuff - PEAR distribution is deprecated since ORM 2.4 and our pear channel on doctrine-project.org doesn't work anymore anyway. It should be dropped:
- any occurences in docs,
- support in `bin/doctrine.bat`,
- `bin/doctrine-pear.php` altogether.
|
non_process
|
drop legacy pear stuff pear distribution is deprecated since orm and our pear channel on doctrine project org doesn t work anymore anyway it should be dropped any occurences in docs support in bin doctrine bat bin doctrine pear php altogether
| 0
|
17,007
| 22,386,210,031
|
IssuesEvent
|
2022-06-17 00:50:53
|
figlesias221/ProyectoDevOps_Grupo3_IglesiasPerezMolinoloJuan
|
https://api.github.com/repos/figlesias221/ProyectoDevOps_Grupo3_IglesiasPerezMolinoloJuan
|
closed
|
Retrospectiva Iteración 2
|
process
|
Esfuerzo en HS-P: (por persona)
- Estimado: 1
- Real: 1 (@matiasmolinolo , @andrujuanoo , @mperezjodal , @figlesias221 )
Usando DAKI.
|
1.0
|
Retrospectiva Iteración 2 - Esfuerzo en HS-P: (por persona)
- Estimado: 1
- Real: 1 (@matiasmolinolo , @andrujuanoo , @mperezjodal , @figlesias221 )
Usando DAKI.
|
process
|
retrospectiva iteración esfuerzo en hs p por persona estimado real matiasmolinolo andrujuanoo mperezjodal usando daki
| 1
|
14,401
| 17,456,124,360
|
IssuesEvent
|
2021-08-06 01:39:02
|
MitchellCodes/eCommerceSite
|
https://api.github.com/repos/MitchellCodes/eCommerceSite
|
closed
|
Add CI Pipeline
|
development process
|
Add continuous integration pipeline that will check to make sure code in a pull request compiles successfully.
|
1.0
|
Add CI Pipeline - Add continuous integration pipeline that will check to make sure code in a pull request compiles successfully.
|
process
|
add ci pipeline add continuous integration pipeline that will check to make sure code in a pull request compiles successfully
| 1
|
24,514
| 6,548,308,438
|
IssuesEvent
|
2017-09-04 20:41:23
|
michaelpb/whiteboard
|
https://api.github.com/repos/michaelpb/whiteboard
|
closed
|
Global menu system for Deck and Start
|
App Features Code Quality enhancement
|
Refactor Open and New into a Global Menu system that Deck does setApplicationMenu
(no need for now for context menu)
- Move ugly left-hand menu to popular actions (New, Import, Open) as big menu items on top of Start
|
1.0
|
Global menu system for Deck and Start - Refactor Open and New into a Global Menu system that Deck does setApplicationMenu
(no need for now for context menu)
- Move ugly left-hand menu to popular actions (New, Import, Open) as big menu items on top of Start
|
non_process
|
global menu system for deck and start refactor open and new into a global menu system that deck does setapplicationmenu no need for now for context menu move ugly left hand menu to popular actions new import open as big menu items on top of start
| 0
|