Dataset schema:

| column | dtype | range / values |
|---|---|---|
| Unnamed: 0 | int64 | 0 to 832k |
| id | float64 | 2.49B to 32.1B |
| type | string | 1 class |
| created_at | string | length 19 |
| repo | string | length 7 to 112 |
| repo_url | string | length 36 to 141 |
| action | string | 3 classes |
| title | string | length 1 to 744 |
| labels | string | length 4 to 574 |
| body | string | length 9 to 211k |
| index | string | 10 classes |
| text_combine | string | length 96 to 211k |
| label | string | 2 classes |
| text | string | length 96 to 188k |
| binary_label | int64 | 0 or 1 |
**Row 19,651**
- id: 26,009,696,425
- type: IssuesEvent
- created_at: 2022-12-20 23:38:24
- repo: hashgraph/hedera-mirror-node
- repo_url: https://api.github.com/repos/hashgraph/hedera-mirror-node
- action: closed
- title: Release production workflow fails to publish helm chart
- labels: bug process
- body:
### Description
When triggered by tag `v0.71.0-beta3`, the release production workflow failed at the job `Publish helm chart`:
```
remote: Permission to hashgraph/hedera-mirror-node.git denied to github-actions[bot].
fatal: unable to access 'https://github.com/hashgraph/hedera-mirror-node/': The requested URL returned error: 403
```
The root cause is that the changes introduced in the marketplace PR restricted the GitHub token's `contents` permission to read-only, while the above step needs write permission to push commits to the repo.
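A hedged sketch of the likely fix: since the workflow-level token permission was narrowed to read-only, the publishing job can request `contents: write` for itself at the job level. The job, step, and script names below are assumptions for illustration, not the repository's actual workflow.

```yaml
# Hypothetical workflow excerpt: keep the restrictive read-only default,
# but grant the chart-publishing job write access to repository contents
# so github-actions[bot] can push commits.
name: Release Production
on:
  push:
    tags: ["v*"]

permissions:
  contents: read          # restrictive default for all jobs

jobs:
  publish-helm-chart:     # hypothetical job name
    runs-on: ubuntu-latest
    permissions:
      contents: write     # required to push the packaged chart commit
    steps:
      - uses: actions/checkout@v3
      - name: Publish helm chart
        run: ./release/publish-chart.sh   # hypothetical script
```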
### Steps to reproduce
https://github.com/hashgraph/hedera-mirror-node/actions/runs/3744125506/jobs/6357441458
### Additional context
_No response_
### Hedera network
other
### Version
v0.71.0-SNAPSHOT
### Operating system
None
- index: 1.0
- text_combine:
Release production workflow fails to publish helm chart - ### Description
When triggered by tag `v0.71.0-beta3`, the release production workflow failed at the job `Publish helm chart`:
```
remote: Permission to hashgraph/hedera-mirror-node.git denied to github-actions[bot].
fatal: unable to access 'https://github.com/hashgraph/hedera-mirror-node/': The requested URL returned error: 403
```
The root cause is that the changes introduced in the marketplace PR restricted the GitHub token's `contents` permission to read-only, while the above step needs write permission to push commits to the repo.
### Steps to reproduce
https://github.com/hashgraph/hedera-mirror-node/actions/runs/3744125506/jobs/6357441458
### Additional context
_No response_
### Hedera network
other
### Version
v0.71.0-SNAPSHOT
### Operating system
None
- label: process
- text:
release production workflow fails to publish helm chart description when triggered by tag the release production workflow failed at the job publish helm chart remote permission to hashgraph hedera mirror node git denied to github actions fatal unable to access the requested url returned error the root cause is the changes introduced in the marketplace pr restricted the github token permission to read only for contents while the above step needs write permission to push commits to the repo steps to reproduce additional context no response hedera network other version snapshot operating system none
- binary_label: 1
**Row 12,714**
- id: 15,088,769,322
- type: IssuesEvent
- created_at: 2021-02-06 02:09:03
- repo: metabase/metabase
- repo_url: https://api.github.com/repos/metabase/metabase
- action: closed
- title: Sandboxing on tables with remapped FK (Display Values) causes query to fail
- labels: .Regression .Reproduced Administration/Data Sandboxes Priority:P1 Querying/Nested Queries Querying/Processor Type:Bug
- body:
**Describe the bug**
When using column sandboxing on a table with a remapped FK (Display Values), the sandboxed query fails on 1.38.0-rc4.
This works in 1.37.8.
**To Reproduce**
1. Admin > Data Model > Sample Dataset > Reviews > Product ID :gear: > Display Values = Use foreign key: `Products.Title`

2. Admin > People > create user "U1" and set attributes `user_id`=`1`
3. Admin > Permissions, revoke all collection and data permissions, and grant full access to Products, and sandbox to Reviews:

4. Log in as user "U1" and go to the Reviews table, which fails with `Value does not match schema: {:query {:fields (named (not ("distinct" a-clojure.lang.PersistentVector)) "Distinct, non-empty sequence of Field clauses")}}`

5. Same instance, just running 1.37.8 works:

<details><summary>Full stacktrace</summary>
```
2021-02-02 15:59:48,158 ERROR middleware.catch-exceptions :: Error processing query: null
{:database_id 1,
:started_at #t "2021-02-02T15:59:47.466413+01:00[Europe/Copenhagen]",
:error_type :invalid-query,
:json_query {:database 1, :query {:source-table 4}, :type "query", :parameters [], :middleware {:js-int-to-string? true, :add-default-userland-constraints? true}},
:native nil,
:status :failed,
:class clojure.lang.ExceptionInfo,
:stacktrace
["--> util.schema$schema_core_validator$fn__17782.invoke(schema.clj:29)"
"query_processor.middleware.wrap_value_literals$wrap_value_literals_STAR_.invokeStatic(wrap_value_literals.clj:137)"
"query_processor.middleware.wrap_value_literals$wrap_value_literals_STAR_.invoke(wrap_value_literals.clj:133)"
"query_processor.middleware.wrap_value_literals$wrap_value_literals$fn__40651.invoke(wrap_value_literals.clj:147)"
"query_processor.middleware.annotate$add_column_info$fn__40536.invoke(annotate.clj:578)"
"query_processor.middleware.permissions$check_query_permissions$fn__46234.invoke(permissions.clj:69)"
"query_processor.middleware.pre_alias_aggregations$pre_alias_aggregations$fn__47767.invoke(pre_alias_aggregations.clj:40)"
"query_processor.middleware.cumulative_aggregations$handle_cumulative_aggregations$fn__46432.invoke(cumulative_aggregations.clj:60)"
"metabase_enterprise.sandbox.query_processor.middleware.row_level_restrictions$apply_row_level_permissions$fn__49135.invoke(row_level_restrictions.clj:327)"
"query_processor.middleware.resolve_joined_fields$resolve_joined_fields$fn__48004.invoke(resolve_joined_fields.clj:35)"
"query_processor.middleware.resolve_joins$resolve_joins$fn__48323.invoke(resolve_joins.clj:184)"
"query_processor.middleware.add_implicit_joins$add_implicit_joins$fn__44755.invoke(add_implicit_joins.clj:249)"
"query_processor.middleware.large_int_id$convert_id_to_string$fn__47044.invoke(large_int_id.clj:44)"
"query_processor.middleware.format_rows$format_rows$fn__47024.invoke(format_rows.clj:74)"
"query_processor.middleware.desugar$desugar$fn__46498.invoke(desugar.clj:21)"
"query_processor.middleware.binning$update_binning_strategy$fn__45516.invoke(binning.clj:225)"
"query_processor.middleware.resolve_fields$resolve_fields$fn__46034.invoke(resolve_fields.clj:24)"
"query_processor.middleware.add_dimension_projections$add_remapping$fn__44303.invoke(add_dimension_projections.clj:315)"
"query_processor.middleware.add_implicit_clauses$add_implicit_clauses$fn__44506.invoke(add_implicit_clauses.clj:138)"
"metabase_enterprise.sandbox.query_processor.middleware.row_level_restrictions$apply_row_level_permissions$fn__49135.invoke(row_level_restrictions.clj:327)"
"query_processor.middleware.add_source_metadata$add_source_metadata_for_source_queries$fn__44904.invoke(add_source_metadata.clj:103)"
"metabase_enterprise.sandbox.query_processor.middleware.column_level_perms_check$maybe_apply_column_level_perms_check$fn__48737.invoke(column_level_perms_check.clj:25)"
"query_processor.middleware.reconcile_breakout_and_order_by_bucketing$reconcile_breakout_and_order_by_bucketing$fn__47964.invoke(reconcile_breakout_and_order_by_bucketing.clj:97)"
"query_processor.middleware.auto_bucket_datetimes$auto_bucket_datetimes$fn__45104.invoke(auto_bucket_datetimes.clj:139)"
"query_processor.middleware.resolve_source_table$resolve_source_tables$fn__46081.invoke(resolve_source_table.clj:45)"
"query_processor.middleware.parameters$substitute_parameters$fn__47749.invoke(parameters.clj:111)"
"query_processor.middleware.resolve_referenced$resolve_referenced_card_resources$fn__46133.invoke(resolve_referenced.clj:79)"
"query_processor.middleware.expand_macros$expand_macros$fn__46754.invoke(expand_macros.clj:155)"
"query_processor.middleware.add_timezone_info$add_timezone_info$fn__44913.invoke(add_timezone_info.clj:15)"
"query_processor.middleware.splice_params_in_response$splice_params_in_response$fn__48685.invoke(splice_params_in_response.clj:32)"
"query_processor.middleware.resolve_database_and_driver$resolve_database_and_driver$fn__47975$fn__47979.invoke(resolve_database_and_driver.clj:31)"
"driver$do_with_driver.invokeStatic(driver.clj:60)"
"driver$do_with_driver.invoke(driver.clj:56)"
"query_processor.middleware.resolve_database_and_driver$resolve_database_and_driver$fn__47975.invoke(resolve_database_and_driver.clj:25)"
"query_processor.middleware.fetch_source_query$resolve_card_id_source_tables$fn__46972.invoke(fetch_source_query.clj:264)"
"query_processor.middleware.store$initialize_store$fn__48694$fn__48695.invoke(store.clj:11)"
"query_processor.store$do_with_store.invokeStatic(store.clj:42)"
"query_processor.store$do_with_store.invoke(store.clj:38)"
"query_processor.middleware.store$initialize_store$fn__48694.invoke(store.clj:10)"
"query_processor.middleware.validate$validate_query$fn__48703.invoke(validate.clj:10)"
"query_processor.middleware.normalize_query$normalize$fn__47096.invoke(normalize_query.clj:22)"
"query_processor.middleware.add_rows_truncated$add_rows_truncated$fn__44773.invoke(add_rows_truncated.clj:35)"
"metabase_enterprise.audit.query_processor.middleware.handle_audit_queries$handle_internal_queries$fn__31273.invoke(handle_audit_queries.clj:162)"
"query_processor.middleware.results_metadata$record_and_return_metadata_BANG_$fn__48670.invoke(results_metadata.clj:146)"
"query_processor.reducible$async_qp$qp_STAR___33081$thunk__33082.invoke(reducible.clj:103)"
"query_processor.reducible$async_qp$qp_STAR___33081.invoke(reducible.clj:109)"
"query_processor.reducible$sync_qp$qp_STAR___33090$fn__33093.invoke(reducible.clj:135)"
"query_processor.reducible$sync_qp$qp_STAR___33090.invoke(reducible.clj:134)"
"query_processor$preprocess_query.invokeStatic(query_processor.clj:162)"
"query_processor$preprocess_query.invoke(query_processor.clj:154)"
"query_processor$query__GT_preprocessed.invokeStatic(query_processor.clj:168)"
"query_processor$query__GT_preprocessed.invoke(query_processor.clj:164)"
"metabase_enterprise.sandbox.query_processor.middleware.row_level_restrictions$fn__48947$preprocess_source_query__48952$fn__48953$fn__48954.invoke(row_level_restrictions.clj:135)"
"metabase_enterprise.sandbox.query_processor.middleware.row_level_restrictions$fn__48947$preprocess_source_query__48952$fn__48953.invoke(row_level_restrictions.clj:134)"
"metabase_enterprise.sandbox.query_processor.middleware.row_level_restrictions$fn__48947$preprocess_source_query__48952.invoke(row_level_restrictions.clj:129)"
"metabase_enterprise.sandbox.query_processor.middleware.row_level_restrictions$fn__49051$gtap__GT_source__49056$fn__49060.invoke(row_level_restrictions.clj:207)"
"metabase_enterprise.sandbox.query_processor.middleware.row_level_restrictions$fn__49051$gtap__GT_source__49056.invoke(row_level_restrictions.clj:195)"
"metabase_enterprise.sandbox.query_processor.middleware.row_level_restrictions$apply_gtap.invokeStatic(row_level_restrictions.clj:258)"
"metabase_enterprise.sandbox.query_processor.middleware.row_level_restrictions$apply_gtap.invoke(row_level_restrictions.clj:244)"
"metabase_enterprise.sandbox.query_processor.middleware.row_level_restrictions$apply_gtaps$replace_49108__49109.invoke(row_level_restrictions.clj:268)"
"mbql.util.match$replace_in_collection$iter__19339__19343$fn__19344.invoke(match.clj:139)"
"mbql.util.match$replace_in_collection.invokeStatic(match.clj:138)"
"mbql.util.match$replace_in_collection.invoke(match.clj:133)"
"metabase_enterprise.sandbox.query_processor.middleware.row_level_restrictions$apply_gtaps$replace_49108__49109.invoke(row_level_restrictions.clj:268)"
"metabase_enterprise.sandbox.query_processor.middleware.row_level_restrictions$apply_gtaps.invokeStatic(row_level_restrictions.clj:268)"
"metabase_enterprise.sandbox.query_processor.middleware.row_level_restrictions$apply_gtaps.invoke(row_level_restrictions.clj:263)"
"metabase_enterprise.sandbox.query_processor.middleware.row_level_restrictions$gtapped_query.invokeStatic(row_level_restrictions.clj:310)"
"metabase_enterprise.sandbox.query_processor.middleware.row_level_restrictions$gtapped_query.invoke(row_level_restrictions.clj:307)"
"metabase_enterprise.sandbox.query_processor.middleware.row_level_restrictions$apply_row_level_permissions$fn__49135.invoke(row_level_restrictions.clj:321)"
"query_processor.middleware.add_source_metadata$add_source_metadata_for_source_queries$fn__44904.invoke(add_source_metadata.clj:103)"
"metabase_enterprise.sandbox.query_processor.middleware.column_level_perms_check$maybe_apply_column_level_perms_check$fn__48737.invoke(column_level_perms_check.clj:25)"
"query_processor.middleware.reconcile_breakout_and_order_by_bucketing$reconcile_breakout_and_order_by_bucketing$fn__47964.invoke(reconcile_breakout_and_order_by_bucketing.clj:97)"
"query_processor.middleware.auto_bucket_datetimes$auto_bucket_datetimes$fn__45104.invoke(auto_bucket_datetimes.clj:139)"
"query_processor.middleware.resolve_source_table$resolve_source_tables$fn__46081.invoke(resolve_source_table.clj:45)"
"query_processor.middleware.parameters$substitute_parameters$fn__47749.invoke(parameters.clj:111)"
"query_processor.middleware.resolve_referenced$resolve_referenced_card_resources$fn__46133.invoke(resolve_referenced.clj:79)"
"query_processor.middleware.expand_macros$expand_macros$fn__46754.invoke(expand_macros.clj:155)"
"query_processor.middleware.add_timezone_info$add_timezone_info$fn__44913.invoke(add_timezone_info.clj:15)"
"query_processor.middleware.splice_params_in_response$splice_params_in_response$fn__48685.invoke(splice_params_in_response.clj:32)"
"query_processor.middleware.resolve_database_and_driver$resolve_database_and_driver$fn__47975$fn__47979.invoke(resolve_database_and_driver.clj:31)"
"driver$do_with_driver.invokeStatic(driver.clj:60)"
"driver$do_with_driver.invoke(driver.clj:56)"
"query_processor.middleware.resolve_database_and_driver$resolve_database_and_driver$fn__47975.invoke(resolve_database_and_driver.clj:25)"
"query_processor.middleware.fetch_source_query$resolve_card_id_source_tables$fn__46972.invoke(fetch_source_query.clj:264)"
"query_processor.middleware.store$initialize_store$fn__48694$fn__48695.invoke(store.clj:11)"
"query_processor.store$do_with_store.invokeStatic(store.clj:44)"
"query_processor.store$do_with_store.invoke(store.clj:38)"
"query_processor.middleware.store$initialize_store$fn__48694.invoke(store.clj:10)"
"query_processor.middleware.validate$validate_query$fn__48703.invoke(validate.clj:10)"
"query_processor.middleware.normalize_query$normalize$fn__47096.invoke(normalize_query.clj:22)"
"query_processor.middleware.add_rows_truncated$add_rows_truncated$fn__44773.invoke(add_rows_truncated.clj:35)"
"metabase_enterprise.audit.query_processor.middleware.handle_audit_queries$handle_internal_queries$fn__31273.invoke(handle_audit_queries.clj:162)"
"query_processor.middleware.results_metadata$record_and_return_metadata_BANG_$fn__48670.invoke(results_metadata.clj:146)"
"query_processor.middleware.constraints$add_default_userland_constraints$fn__46375.invoke(constraints.clj:42)"
"query_processor.middleware.process_userland_query$process_userland_query$fn__47838.invoke(process_userland_query.clj:135)"
"query_processor.middleware.catch_exceptions$catch_exceptions$fn__46318.invoke(catch_exceptions.clj:173)"
"query_processor.reducible$async_qp$qp_STAR___33081$thunk__33082.invoke(reducible.clj:103)"
"query_processor.reducible$async_qp$qp_STAR___33081.invoke(reducible.clj:109)"
"query_processor.reducible$sync_qp$qp_STAR___33090$fn__33093.invoke(reducible.clj:135)"
"query_processor.reducible$sync_qp$qp_STAR___33090.invoke(reducible.clj:134)"
"query_processor$process_userland_query.invokeStatic(query_processor.clj:235)"
"query_processor$process_userland_query.doInvoke(query_processor.clj:231)"
"query_processor$fn__49181$process_query_and_save_execution_BANG___49190$fn__49193.invoke(query_processor.clj:247)"
"query_processor$fn__49181$process_query_and_save_execution_BANG___49190.invoke(query_processor.clj:239)"
"query_processor$fn__49225$process_query_and_save_with_max_results_constraints_BANG___49234$fn__49237.invoke(query_processor.clj:259)"
"query_processor$fn__49225$process_query_and_save_with_max_results_constraints_BANG___49234.invoke(query_processor.clj:252)"
"api.dataset$fn__63122$fn__63125.invoke(dataset.clj:55)"
"query_processor.streaming$streaming_response_STAR_$fn__63103$fn__63104.invoke(streaming.clj:72)"
"query_processor.streaming$streaming_response_STAR_$fn__63103.invoke(streaming.clj:71)"
"async.streaming_response$do_f_STAR_.invokeStatic(streaming_response.clj:65)"
"async.streaming_response$do_f_STAR_.invoke(streaming_response.clj:63)"
"async.streaming_response$do_f_async$fn__17489.invoke(streaming_response.clj:84)"],
:context :ad-hoc,
:error "Value does not match schema: {:query {:fields (named (not (\"distinct\" a-clojure.lang.PersistentVector)) \"Distinct, non-empty sequence of Field clauses\")}}",
:row_count 0,
:running_time 0,
:preprocessed nil,
:ex-data
{:type :schema.core/error,
:value
{:database 1,
:type :query,
:query
{:source-metadata
[{:name "ID", :id 36, :table_id 4, :display_name "ID", :base_type :type/BigInteger, :special_type :type/PK, :fingerprint nil, :settings nil}
{:name "PRODUCT_ID", :id 33, :table_id 4, :display_name "Product ID", :base_type :type/Integer, :special_type :type/FK, :fingerprint {:global {:distinct-count 176, :nil% 0.0}}, :settings nil}
{:name "REVIEWER",
:id 35,
:table_id 4,
:display_name "Reviewer",
:base_type :type/Text,
:special_type nil,
:fingerprint
{:global {:distinct-count 1076, :nil% 0.0},
:type {:type/Text {:percent-json 0.0, :percent-url 0.0, :percent-email 0.0, :percent-state 0.001798561151079137, :average-length 9.972122302158274}}},
:settings nil}
{:name "RATING",
:id 31,
:table_id 4,
:display_name "Rating",
:base_type :type/Integer,
:special_type :type/Score,
:fingerprint {:global {:distinct-count 5, :nil% 0.0}, :type {:type/Number {:min 1.0, :q1 3.54744353181696, :q3 4.764807071650455, :max 5.0, :sd 1.0443899855660577, :avg 3.987410071942446}}},
:settings nil}
{:name "BODY",
:id 32,
:table_id 4,
:display_name "Body",
:base_type :type/Text,
:special_type :type/Description,
:fingerprint {:global {:distinct-count 1112, :nil% 0.0}, :type {:type/Text {:percent-json 0.0, :percent-url 0.0, :percent-email 0.0, :percent-state 0.0, :average-length 177.41996402877697}}},
:settings nil}
{:table_id 4,
:special_type :type/CreationTimestamp,
:unit :default,
:name "CREATED_AT",
:settings nil,
:id 34,
:display_name "Created At",
:fingerprint {:global {:distinct-count 1112, :nil% 0.0}, :type {:type/DateTime {:earliest "2016-06-03T00:37:05.818Z", :latest "2020-04-19T14:15:25.677Z"}}},
:base_type :type/DateTime}
{:table_id 1,
:special_type :type/Title,
:name "TITLE",
:settings nil,
:id 5,
:display_name "Product → Title",
:fingerprint {:global {:distinct-count 199, :nil% 0.0}, :type {:type/Text {:percent-json 0.0, :percent-url 0.0, :percent-email 0.0, :percent-state 0.0, :average-length 21.495}}},
:base_type :type/Text,
:source_alias "PRODUCTS__via__PRODUCT_ID"}],
:fields
[[:field-id 36]
[:field-id 33]
[:field-id 35]
[:field-id 31]
[:field-id 32]
[:field-id 34]
[:joined-field "PRODUCTS__via__PRODUCT_ID" [:field-id 5]]
[:joined-field "PRODUCTS__via__PRODUCT_ID" [:field-id 5]]],
:joins [{:strategy :left-join, :source-table 1, :alias "PRODUCTS__via__PRODUCT_ID", :fk-field-id 33, :condition [:= [:field-id 33] [:joined-field "PRODUCTS__via__PRODUCT_ID" [:field-id 8]]]}],
:source-query
{:source-table 4,
:filter [:= [:field-id 33] [:value 1 {:base_type :type/Integer, :special_type :type/FK, :database_type "INTEGER", :name "PRODUCT_ID"}]],
:fields [[:field-id 36] [:field-id 33] [:field-id 35] [:field-id 31] [:field-id 32] [:datetime-field [:field-id 34] :default] [:joined-field "PRODUCTS__via__PRODUCT_ID" [:field-id 5]]],
:joins
[{:strategy :left-join, :source-table 1, :alias "PRODUCTS__via__PRODUCT_ID", :fk-field-id 33, :condition [:= [:field-id 33] [:joined-field "PRODUCTS__via__PRODUCT_ID" [:field-id 8]]]}]}}},
:error {:query {:fields (named (not ("distinct" a-clojure.lang.PersistentVector)) "Distinct, non-empty sequence of Field clauses")}}},
:data {:rows [], :cols []}}
```
</details>
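The failure in the stack trace above comes from the doubled `[:joined-field "PRODUCTS__via__PRODUCT_ID" [:field-id 5]]` clause in `:fields`, which violates the schema's "Distinct, non-empty sequence of Field clauses" constraint. A minimal sketch of that kind of check, written in hypothetical Python rather than Metabase's actual Clojure:

```python
def freeze(clause):
    """Recursively convert nested lists to tuples so clauses are hashable."""
    if isinstance(clause, list):
        return tuple(freeze(x) for x in clause)
    return clause

def validate_fields(fields):
    """Require a distinct, non-empty sequence of MBQL-style field clauses."""
    if not fields:
        raise ValueError("fields must be a non-empty sequence")
    seen = set()
    for clause in fields:
        key = freeze(clause)
        if key in seen:
            raise ValueError(f"duplicate field clause: {clause}")
        seen.add(key)
    return fields

# The sandboxed query above ends up with the joined Title field twice:
fields = [
    ["field-id", 36],
    ["joined-field", "PRODUCTS__via__PRODUCT_ID", ["field-id", 5]],
    ["joined-field", "PRODUCTS__via__PRODUCT_ID", ["field-id", 5]],
]
try:
    validate_fields(fields)
except ValueError as e:
    print(e)  # reports the duplicate clause, mirroring the schema error
```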
**Information about your Metabase Installation:**
Hash `6312927` ~1.38.0-rc4
**Additional context**
Related to https://github.com/metabase/metabase-enterprise/issues/405
- index: 1.0
- text_combine:
Sandboxing on tables with remapped FK (Display Values) causes query to fail - **Describe the bug**
When using column sandboxing on a table with a remapped FK (Display Values), the sandboxed query fails on 1.38.0-rc4.
This works in 1.37.8.
**To Reproduce**
1. Admin > Data Model > Sample Dataset > Reviews > Product ID :gear: > Display Values = Use foreign key: `Products.Title`

2. Admin > People > create user "U1" and set attributes `user_id`=`1`
3. Admin > Permissions, revoke all collection and data permissions, and grant full access to Products, and sandbox to Reviews:

4. Log in as user "U1" and go to the Reviews table, which fails with `Value does not match schema: {:query {:fields (named (not ("distinct" a-clojure.lang.PersistentVector)) "Distinct, non-empty sequence of Field clauses")}}`

5. Same instance, just running 1.37.8 works:

<details><summary>Full stacktrace</summary>
```
2021-02-02 15:59:48,158 ERROR middleware.catch-exceptions :: Error processing query: null
{:database_id 1,
:started_at #t "2021-02-02T15:59:47.466413+01:00[Europe/Copenhagen]",
:error_type :invalid-query,
:json_query {:database 1, :query {:source-table 4}, :type "query", :parameters [], :middleware {:js-int-to-string? true, :add-default-userland-constraints? true}},
:native nil,
:status :failed,
:class clojure.lang.ExceptionInfo,
:stacktrace
["--> util.schema$schema_core_validator$fn__17782.invoke(schema.clj:29)"
"query_processor.middleware.wrap_value_literals$wrap_value_literals_STAR_.invokeStatic(wrap_value_literals.clj:137)"
"query_processor.middleware.wrap_value_literals$wrap_value_literals_STAR_.invoke(wrap_value_literals.clj:133)"
"query_processor.middleware.wrap_value_literals$wrap_value_literals$fn__40651.invoke(wrap_value_literals.clj:147)"
"query_processor.middleware.annotate$add_column_info$fn__40536.invoke(annotate.clj:578)"
"query_processor.middleware.permissions$check_query_permissions$fn__46234.invoke(permissions.clj:69)"
"query_processor.middleware.pre_alias_aggregations$pre_alias_aggregations$fn__47767.invoke(pre_alias_aggregations.clj:40)"
"query_processor.middleware.cumulative_aggregations$handle_cumulative_aggregations$fn__46432.invoke(cumulative_aggregations.clj:60)"
"metabase_enterprise.sandbox.query_processor.middleware.row_level_restrictions$apply_row_level_permissions$fn__49135.invoke(row_level_restrictions.clj:327)"
"query_processor.middleware.resolve_joined_fields$resolve_joined_fields$fn__48004.invoke(resolve_joined_fields.clj:35)"
"query_processor.middleware.resolve_joins$resolve_joins$fn__48323.invoke(resolve_joins.clj:184)"
"query_processor.middleware.add_implicit_joins$add_implicit_joins$fn__44755.invoke(add_implicit_joins.clj:249)"
"query_processor.middleware.large_int_id$convert_id_to_string$fn__47044.invoke(large_int_id.clj:44)"
"query_processor.middleware.format_rows$format_rows$fn__47024.invoke(format_rows.clj:74)"
"query_processor.middleware.desugar$desugar$fn__46498.invoke(desugar.clj:21)"
"query_processor.middleware.binning$update_binning_strategy$fn__45516.invoke(binning.clj:225)"
"query_processor.middleware.resolve_fields$resolve_fields$fn__46034.invoke(resolve_fields.clj:24)"
"query_processor.middleware.add_dimension_projections$add_remapping$fn__44303.invoke(add_dimension_projections.clj:315)"
"query_processor.middleware.add_implicit_clauses$add_implicit_clauses$fn__44506.invoke(add_implicit_clauses.clj:138)"
"metabase_enterprise.sandbox.query_processor.middleware.row_level_restrictions$apply_row_level_permissions$fn__49135.invoke(row_level_restrictions.clj:327)"
"query_processor.middleware.add_source_metadata$add_source_metadata_for_source_queries$fn__44904.invoke(add_source_metadata.clj:103)"
"metabase_enterprise.sandbox.query_processor.middleware.column_level_perms_check$maybe_apply_column_level_perms_check$fn__48737.invoke(column_level_perms_check.clj:25)"
"query_processor.middleware.reconcile_breakout_and_order_by_bucketing$reconcile_breakout_and_order_by_bucketing$fn__47964.invoke(reconcile_breakout_and_order_by_bucketing.clj:97)"
"query_processor.middleware.auto_bucket_datetimes$auto_bucket_datetimes$fn__45104.invoke(auto_bucket_datetimes.clj:139)"
"query_processor.middleware.resolve_source_table$resolve_source_tables$fn__46081.invoke(resolve_source_table.clj:45)"
"query_processor.middleware.parameters$substitute_parameters$fn__47749.invoke(parameters.clj:111)"
"query_processor.middleware.resolve_referenced$resolve_referenced_card_resources$fn__46133.invoke(resolve_referenced.clj:79)"
"query_processor.middleware.expand_macros$expand_macros$fn__46754.invoke(expand_macros.clj:155)"
"query_processor.middleware.add_timezone_info$add_timezone_info$fn__44913.invoke(add_timezone_info.clj:15)"
"query_processor.middleware.splice_params_in_response$splice_params_in_response$fn__48685.invoke(splice_params_in_response.clj:32)"
"query_processor.middleware.resolve_database_and_driver$resolve_database_and_driver$fn__47975$fn__47979.invoke(resolve_database_and_driver.clj:31)"
"driver$do_with_driver.invokeStatic(driver.clj:60)"
"driver$do_with_driver.invoke(driver.clj:56)"
"query_processor.middleware.resolve_database_and_driver$resolve_database_and_driver$fn__47975.invoke(resolve_database_and_driver.clj:25)"
"query_processor.middleware.fetch_source_query$resolve_card_id_source_tables$fn__46972.invoke(fetch_source_query.clj:264)"
"query_processor.middleware.store$initialize_store$fn__48694$fn__48695.invoke(store.clj:11)"
"query_processor.store$do_with_store.invokeStatic(store.clj:42)"
"query_processor.store$do_with_store.invoke(store.clj:38)"
"query_processor.middleware.store$initialize_store$fn__48694.invoke(store.clj:10)"
"query_processor.middleware.validate$validate_query$fn__48703.invoke(validate.clj:10)"
"query_processor.middleware.normalize_query$normalize$fn__47096.invoke(normalize_query.clj:22)"
"query_processor.middleware.add_rows_truncated$add_rows_truncated$fn__44773.invoke(add_rows_truncated.clj:35)"
"metabase_enterprise.audit.query_processor.middleware.handle_audit_queries$handle_internal_queries$fn__31273.invoke(handle_audit_queries.clj:162)"
"query_processor.middleware.results_metadata$record_and_return_metadata_BANG_$fn__48670.invoke(results_metadata.clj:146)"
"query_processor.reducible$async_qp$qp_STAR___33081$thunk__33082.invoke(reducible.clj:103)"
"query_processor.reducible$async_qp$qp_STAR___33081.invoke(reducible.clj:109)"
"query_processor.reducible$sync_qp$qp_STAR___33090$fn__33093.invoke(reducible.clj:135)"
"query_processor.reducible$sync_qp$qp_STAR___33090.invoke(reducible.clj:134)"
"query_processor$preprocess_query.invokeStatic(query_processor.clj:162)"
"query_processor$preprocess_query.invoke(query_processor.clj:154)"
"query_processor$query__GT_preprocessed.invokeStatic(query_processor.clj:168)"
"query_processor$query__GT_preprocessed.invoke(query_processor.clj:164)"
"metabase_enterprise.sandbox.query_processor.middleware.row_level_restrictions$fn__48947$preprocess_source_query__48952$fn__48953$fn__48954.invoke(row_level_restrictions.clj:135)"
"metabase_enterprise.sandbox.query_processor.middleware.row_level_restrictions$fn__48947$preprocess_source_query__48952$fn__48953.invoke(row_level_restrictions.clj:134)"
"metabase_enterprise.sandbox.query_processor.middleware.row_level_restrictions$fn__48947$preprocess_source_query__48952.invoke(row_level_restrictions.clj:129)"
"metabase_enterprise.sandbox.query_processor.middleware.row_level_restrictions$fn__49051$gtap__GT_source__49056$fn__49060.invoke(row_level_restrictions.clj:207)"
"metabase_enterprise.sandbox.query_processor.middleware.row_level_restrictions$fn__49051$gtap__GT_source__49056.invoke(row_level_restrictions.clj:195)"
"metabase_enterprise.sandbox.query_processor.middleware.row_level_restrictions$apply_gtap.invokeStatic(row_level_restrictions.clj:258)"
"metabase_enterprise.sandbox.query_processor.middleware.row_level_restrictions$apply_gtap.invoke(row_level_restrictions.clj:244)"
"metabase_enterprise.sandbox.query_processor.middleware.row_level_restrictions$apply_gtaps$replace_49108__49109.invoke(row_level_restrictions.clj:268)"
"mbql.util.match$replace_in_collection$iter__19339__19343$fn__19344.invoke(match.clj:139)"
"mbql.util.match$replace_in_collection.invokeStatic(match.clj:138)"
"mbql.util.match$replace_in_collection.invoke(match.clj:133)"
"metabase_enterprise.sandbox.query_processor.middleware.row_level_restrictions$apply_gtaps$replace_49108__49109.invoke(row_level_restrictions.clj:268)"
"metabase_enterprise.sandbox.query_processor.middleware.row_level_restrictions$apply_gtaps.invokeStatic(row_level_restrictions.clj:268)"
"metabase_enterprise.sandbox.query_processor.middleware.row_level_restrictions$apply_gtaps.invoke(row_level_restrictions.clj:263)"
"metabase_enterprise.sandbox.query_processor.middleware.row_level_restrictions$gtapped_query.invokeStatic(row_level_restrictions.clj:310)"
"metabase_enterprise.sandbox.query_processor.middleware.row_level_restrictions$gtapped_query.invoke(row_level_restrictions.clj:307)"
"metabase_enterprise.sandbox.query_processor.middleware.row_level_restrictions$apply_row_level_permissions$fn__49135.invoke(row_level_restrictions.clj:321)"
"query_processor.middleware.add_source_metadata$add_source_metadata_for_source_queries$fn__44904.invoke(add_source_metadata.clj:103)"
"metabase_enterprise.sandbox.query_processor.middleware.column_level_perms_check$maybe_apply_column_level_perms_check$fn__48737.invoke(column_level_perms_check.clj:25)"
"query_processor.middleware.reconcile_breakout_and_order_by_bucketing$reconcile_breakout_and_order_by_bucketing$fn__47964.invoke(reconcile_breakout_and_order_by_bucketing.clj:97)"
"query_processor.middleware.auto_bucket_datetimes$auto_bucket_datetimes$fn__45104.invoke(auto_bucket_datetimes.clj:139)"
"query_processor.middleware.resolve_source_table$resolve_source_tables$fn__46081.invoke(resolve_source_table.clj:45)"
"query_processor.middleware.parameters$substitute_parameters$fn__47749.invoke(parameters.clj:111)"
"query_processor.middleware.resolve_referenced$resolve_referenced_card_resources$fn__46133.invoke(resolve_referenced.clj:79)"
"query_processor.middleware.expand_macros$expand_macros$fn__46754.invoke(expand_macros.clj:155)"
"query_processor.middleware.add_timezone_info$add_timezone_info$fn__44913.invoke(add_timezone_info.clj:15)"
"query_processor.middleware.splice_params_in_response$splice_params_in_response$fn__48685.invoke(splice_params_in_response.clj:32)"
"query_processor.middleware.resolve_database_and_driver$resolve_database_and_driver$fn__47975$fn__47979.invoke(resolve_database_and_driver.clj:31)"
"driver$do_with_driver.invokeStatic(driver.clj:60)"
"driver$do_with_driver.invoke(driver.clj:56)"
"query_processor.middleware.resolve_database_and_driver$resolve_database_and_driver$fn__47975.invoke(resolve_database_and_driver.clj:25)"
"query_processor.middleware.fetch_source_query$resolve_card_id_source_tables$fn__46972.invoke(fetch_source_query.clj:264)"
"query_processor.middleware.store$initialize_store$fn__48694$fn__48695.invoke(store.clj:11)"
"query_processor.store$do_with_store.invokeStatic(store.clj:44)"
"query_processor.store$do_with_store.invoke(store.clj:38)"
"query_processor.middleware.store$initialize_store$fn__48694.invoke(store.clj:10)"
"query_processor.middleware.validate$validate_query$fn__48703.invoke(validate.clj:10)"
"query_processor.middleware.normalize_query$normalize$fn__47096.invoke(normalize_query.clj:22)"
"query_processor.middleware.add_rows_truncated$add_rows_truncated$fn__44773.invoke(add_rows_truncated.clj:35)"
"metabase_enterprise.audit.query_processor.middleware.handle_audit_queries$handle_internal_queries$fn__31273.invoke(handle_audit_queries.clj:162)"
"query_processor.middleware.results_metadata$record_and_return_metadata_BANG_$fn__48670.invoke(results_metadata.clj:146)"
"query_processor.middleware.constraints$add_default_userland_constraints$fn__46375.invoke(constraints.clj:42)"
"query_processor.middleware.process_userland_query$process_userland_query$fn__47838.invoke(process_userland_query.clj:135)"
"query_processor.middleware.catch_exceptions$catch_exceptions$fn__46318.invoke(catch_exceptions.clj:173)"
"query_processor.reducible$async_qp$qp_STAR___33081$thunk__33082.invoke(reducible.clj:103)"
"query_processor.reducible$async_qp$qp_STAR___33081.invoke(reducible.clj:109)"
"query_processor.reducible$sync_qp$qp_STAR___33090$fn__33093.invoke(reducible.clj:135)"
"query_processor.reducible$sync_qp$qp_STAR___33090.invoke(reducible.clj:134)"
"query_processor$process_userland_query.invokeStatic(query_processor.clj:235)"
"query_processor$process_userland_query.doInvoke(query_processor.clj:231)"
"query_processor$fn__49181$process_query_and_save_execution_BANG___49190$fn__49193.invoke(query_processor.clj:247)"
"query_processor$fn__49181$process_query_and_save_execution_BANG___49190.invoke(query_processor.clj:239)"
"query_processor$fn__49225$process_query_and_save_with_max_results_constraints_BANG___49234$fn__49237.invoke(query_processor.clj:259)"
"query_processor$fn__49225$process_query_and_save_with_max_results_constraints_BANG___49234.invoke(query_processor.clj:252)"
"api.dataset$fn__63122$fn__63125.invoke(dataset.clj:55)"
"query_processor.streaming$streaming_response_STAR_$fn__63103$fn__63104.invoke(streaming.clj:72)"
"query_processor.streaming$streaming_response_STAR_$fn__63103.invoke(streaming.clj:71)"
"async.streaming_response$do_f_STAR_.invokeStatic(streaming_response.clj:65)"
"async.streaming_response$do_f_STAR_.invoke(streaming_response.clj:63)"
"async.streaming_response$do_f_async$fn__17489.invoke(streaming_response.clj:84)"],
:context :ad-hoc,
:error "Value does not match schema: {:query {:fields (named (not (\"distinct\" a-clojure.lang.PersistentVector)) \"Distinct, non-empty sequence of Field clauses\")}}",
:row_count 0,
:running_time 0,
:preprocessed nil,
:ex-data
{:type :schema.core/error,
:value
{:database 1,
:type :query,
:query
{:source-metadata
[{:name "ID", :id 36, :table_id 4, :display_name "ID", :base_type :type/BigInteger, :special_type :type/PK, :fingerprint nil, :settings nil}
{:name "PRODUCT_ID", :id 33, :table_id 4, :display_name "Product ID", :base_type :type/Integer, :special_type :type/FK, :fingerprint {:global {:distinct-count 176, :nil% 0.0}}, :settings nil}
{:name "REVIEWER",
:id 35,
:table_id 4,
:display_name "Reviewer",
:base_type :type/Text,
:special_type nil,
:fingerprint
{:global {:distinct-count 1076, :nil% 0.0},
:type {:type/Text {:percent-json 0.0, :percent-url 0.0, :percent-email 0.0, :percent-state 0.001798561151079137, :average-length 9.972122302158274}}},
:settings nil}
{:name "RATING",
:id 31,
:table_id 4,
:display_name "Rating",
:base_type :type/Integer,
:special_type :type/Score,
:fingerprint {:global {:distinct-count 5, :nil% 0.0}, :type {:type/Number {:min 1.0, :q1 3.54744353181696, :q3 4.764807071650455, :max 5.0, :sd 1.0443899855660577, :avg 3.987410071942446}}},
:settings nil}
{:name "BODY",
:id 32,
:table_id 4,
:display_name "Body",
:base_type :type/Text,
:special_type :type/Description,
:fingerprint {:global {:distinct-count 1112, :nil% 0.0}, :type {:type/Text {:percent-json 0.0, :percent-url 0.0, :percent-email 0.0, :percent-state 0.0, :average-length 177.41996402877697}}},
:settings nil}
{:table_id 4,
:special_type :type/CreationTimestamp,
:unit :default,
:name "CREATED_AT",
:settings nil,
:id 34,
:display_name "Created At",
:fingerprint {:global {:distinct-count 1112, :nil% 0.0}, :type {:type/DateTime {:earliest "2016-06-03T00:37:05.818Z", :latest "2020-04-19T14:15:25.677Z"}}},
:base_type :type/DateTime}
{:table_id 1,
:special_type :type/Title,
:name "TITLE",
:settings nil,
:id 5,
:display_name "Product → Title",
:fingerprint {:global {:distinct-count 199, :nil% 0.0}, :type {:type/Text {:percent-json 0.0, :percent-url 0.0, :percent-email 0.0, :percent-state 0.0, :average-length 21.495}}},
:base_type :type/Text,
:source_alias "PRODUCTS__via__PRODUCT_ID"}],
:fields
[[:field-id 36]
[:field-id 33]
[:field-id 35]
[:field-id 31]
[:field-id 32]
[:field-id 34]
[:joined-field "PRODUCTS__via__PRODUCT_ID" [:field-id 5]]
[:joined-field "PRODUCTS__via__PRODUCT_ID" [:field-id 5]]],
:joins [{:strategy :left-join, :source-table 1, :alias "PRODUCTS__via__PRODUCT_ID", :fk-field-id 33, :condition [:= [:field-id 33] [:joined-field "PRODUCTS__via__PRODUCT_ID" [:field-id 8]]]}],
:source-query
{:source-table 4,
:filter [:= [:field-id 33] [:value 1 {:base_type :type/Integer, :special_type :type/FK, :database_type "INTEGER", :name "PRODUCT_ID"}]],
:fields [[:field-id 36] [:field-id 33] [:field-id 35] [:field-id 31] [:field-id 32] [:datetime-field [:field-id 34] :default] [:joined-field "PRODUCTS__via__PRODUCT_ID" [:field-id 5]]],
:joins
[{:strategy :left-join, :source-table 1, :alias "PRODUCTS__via__PRODUCT_ID", :fk-field-id 33, :condition [:= [:field-id 33] [:joined-field "PRODUCTS__via__PRODUCT_ID" [:field-id 8]]]}]}}},
:error {:query {:fields (named (not ("distinct" a-clojure.lang.PersistentVector)) "Distinct, non-empty sequence of Field clauses")}}},
:data {:rows [], :cols []}}
```
</details>
**Information about your Metabase Installation:**
Hash `6312927` ~1.38.0-rc4
**Additional context**
Related to https://github.com/metabase/metabase-enterprise/issues/405
|
process
|
sandboxing on tables with remapped fk display values causes query to fail describe the bug when using column sandboxing on a table with remapped fk display values then the sandboxed query fails on this works in to reproduce admin data model sample dataset reviews product id gear display values use foreign key products title admin people create user and set attributes user id admin permissions revoke all collection and data permissions and grant full access to products and sandbox to reviews login as user and go to reviews table which fails with value does not match schema query fields named not distinct a clojure lang persistentvector distinct non empty sequence of field clauses same instance just running works full stacktrace error middleware catch exceptions error processing query null database id started at t error type invalid query json query database query source table type query parameters middleware js int to string true add default userland constraints true native nil status failed class clojure lang exceptioninfo stacktrace util schema schema core validator fn invoke schema clj query processor middleware wrap value literals wrap value literals star invokestatic wrap value literals clj query processor middleware wrap value literals wrap value literals star invoke wrap value literals clj query processor middleware wrap value literals wrap value literals fn invoke wrap value literals clj query processor middleware annotate add column info fn invoke annotate clj query processor middleware permissions check query permissions fn invoke permissions clj query processor middleware pre alias aggregations pre alias aggregations fn invoke pre alias aggregations clj query processor middleware cumulative aggregations handle cumulative aggregations fn invoke cumulative aggregations clj metabase enterprise sandbox query processor middleware row level restrictions apply row level permissions fn invoke row level restrictions clj query processor middleware resolve joined 
fields resolve joined fields fn invoke resolve joined fields clj query processor middleware resolve joins resolve joins fn invoke resolve joins clj query processor middleware add implicit joins add implicit joins fn invoke add implicit joins clj query processor middleware large int id convert id to string fn invoke large int id clj query processor middleware format rows format rows fn invoke format rows clj query processor middleware desugar desugar fn invoke desugar clj query processor middleware binning update binning strategy fn invoke binning clj query processor middleware resolve fields resolve fields fn invoke resolve fields clj query processor middleware add dimension projections add remapping fn invoke add dimension projections clj query processor middleware add implicit clauses add implicit clauses fn invoke add implicit clauses clj metabase enterprise sandbox query processor middleware row level restrictions apply row level permissions fn invoke row level restrictions clj query processor middleware add source metadata add source metadata for source queries fn invoke add source metadata clj metabase enterprise sandbox query processor middleware column level perms check maybe apply column level perms check fn invoke column level perms check clj query processor middleware reconcile breakout and order by bucketing reconcile breakout and order by bucketing fn invoke reconcile breakout and order by bucketing clj query processor middleware auto bucket datetimes auto bucket datetimes fn invoke auto bucket datetimes clj query processor middleware resolve source table resolve source tables fn invoke resolve source table clj query processor middleware parameters substitute parameters fn invoke parameters clj query processor middleware resolve referenced resolve referenced card resources fn invoke resolve referenced clj query processor middleware expand macros expand macros fn invoke expand macros clj query processor middleware add timezone info add timezone info fn 
invoke add timezone info clj query processor middleware splice params in response splice params in response fn invoke splice params in response clj query processor middleware resolve database and driver resolve database and driver fn fn invoke resolve database and driver clj driver do with driver invokestatic driver clj driver do with driver invoke driver clj query processor middleware resolve database and driver resolve database and driver fn invoke resolve database and driver clj query processor middleware fetch source query resolve card id source tables fn invoke fetch source query clj query processor middleware store initialize store fn fn invoke store clj query processor store do with store invokestatic store clj query processor store do with store invoke store clj query processor middleware store initialize store fn invoke store clj query processor middleware validate validate query fn invoke validate clj query processor middleware normalize query normalize fn invoke normalize query clj query processor middleware add rows truncated add rows truncated fn invoke add rows truncated clj metabase enterprise audit query processor middleware handle audit queries handle internal queries fn invoke handle audit queries clj query processor middleware results metadata record and return metadata bang fn invoke results metadata clj query processor reducible async qp qp star thunk invoke reducible clj query processor reducible async qp qp star invoke reducible clj query processor reducible sync qp qp star fn invoke reducible clj query processor reducible sync qp qp star invoke reducible clj query processor preprocess query invokestatic query processor clj query processor preprocess query invoke query processor clj query processor query gt preprocessed invokestatic query processor clj query processor query gt preprocessed invoke query processor clj metabase enterprise sandbox query processor middleware row level restrictions fn preprocess source query fn fn invoke row level 
restrictions clj metabase enterprise sandbox query processor middleware row level restrictions fn preprocess source query fn invoke row level restrictions clj metabase enterprise sandbox query processor middleware row level restrictions fn preprocess source query invoke row level restrictions clj metabase enterprise sandbox query processor middleware row level restrictions fn gtap gt source fn invoke row level restrictions clj metabase enterprise sandbox query processor middleware row level restrictions fn gtap gt source invoke row level restrictions clj metabase enterprise sandbox query processor middleware row level restrictions apply gtap invokestatic row level restrictions clj metabase enterprise sandbox query processor middleware row level restrictions apply gtap invoke row level restrictions clj metabase enterprise sandbox query processor middleware row level restrictions apply gtaps replace invoke row level restrictions clj mbql util match replace in collection iter fn invoke match clj mbql util match replace in collection invokestatic match clj mbql util match replace in collection invoke match clj metabase enterprise sandbox query processor middleware row level restrictions apply gtaps replace invoke row level restrictions clj metabase enterprise sandbox query processor middleware row level restrictions apply gtaps invokestatic row level restrictions clj metabase enterprise sandbox query processor middleware row level restrictions apply gtaps invoke row level restrictions clj metabase enterprise sandbox query processor middleware row level restrictions gtapped query invokestatic row level restrictions clj metabase enterprise sandbox query processor middleware row level restrictions gtapped query invoke row level restrictions clj metabase enterprise sandbox query processor middleware row level restrictions apply row level permissions fn invoke row level restrictions clj query processor middleware add source metadata add source metadata for source queries fn 
invoke add source metadata clj metabase enterprise sandbox query processor middleware column level perms check maybe apply column level perms check fn invoke column level perms check clj query processor middleware reconcile breakout and order by bucketing reconcile breakout and order by bucketing fn invoke reconcile breakout and order by bucketing clj query processor middleware auto bucket datetimes auto bucket datetimes fn invoke auto bucket datetimes clj query processor middleware resolve source table resolve source tables fn invoke resolve source table clj query processor middleware parameters substitute parameters fn invoke parameters clj query processor middleware resolve referenced resolve referenced card resources fn invoke resolve referenced clj query processor middleware expand macros expand macros fn invoke expand macros clj query processor middleware add timezone info add timezone info fn invoke add timezone info clj query processor middleware splice params in response splice params in response fn invoke splice params in response clj query processor middleware resolve database and driver resolve database and driver fn fn invoke resolve database and driver clj driver do with driver invokestatic driver clj driver do with driver invoke driver clj query processor middleware resolve database and driver resolve database and driver fn invoke resolve database and driver clj query processor middleware fetch source query resolve card id source tables fn invoke fetch source query clj query processor middleware store initialize store fn fn invoke store clj query processor store do with store invokestatic store clj query processor store do with store invoke store clj query processor middleware store initialize store fn invoke store clj query processor middleware validate validate query fn invoke validate clj query processor middleware normalize query normalize fn invoke normalize query clj query processor middleware add rows truncated add rows truncated fn invoke add 
rows truncated clj metabase enterprise audit query processor middleware handle audit queries handle internal queries fn invoke handle audit queries clj query processor middleware results metadata record and return metadata bang fn invoke results metadata clj query processor middleware constraints add default userland constraints fn invoke constraints clj query processor middleware process userland query process userland query fn invoke process userland query clj query processor middleware catch exceptions catch exceptions fn invoke catch exceptions clj query processor reducible async qp qp star thunk invoke reducible clj query processor reducible async qp qp star invoke reducible clj query processor reducible sync qp qp star fn invoke reducible clj query processor reducible sync qp qp star invoke reducible clj query processor process userland query invokestatic query processor clj query processor process userland query doinvoke query processor clj query processor fn process query and save execution bang fn invoke query processor clj query processor fn process query and save execution bang invoke query processor clj query processor fn process query and save with max results constraints bang fn invoke query processor clj query processor fn process query and save with max results constraints bang invoke query processor clj api dataset fn fn invoke dataset clj query processor streaming streaming response star fn fn invoke streaming clj query processor streaming streaming response star fn invoke streaming clj async streaming response do f star invokestatic streaming response clj async streaming response do f star invoke streaming response clj async streaming response do f async fn invoke streaming response clj context ad hoc error value does not match schema query fields named not distinct a clojure lang persistentvector distinct non empty sequence of field clauses row count running time preprocessed nil ex data type schema core error value database type query query 
source metadata name id id table id display name id base type type biginteger special type type pk fingerprint nil settings nil name product id id table id display name product id base type type integer special type type fk fingerprint global distinct count nil settings nil name reviewer id table id display name reviewer base type type text special type nil fingerprint global distinct count nil type type text percent json percent url percent email percent state average length settings nil name rating id table id display name rating base type type integer special type type score fingerprint global distinct count nil type type number min max sd avg settings nil name body id table id display name body base type type text special type type description fingerprint global distinct count nil type type text percent json percent url percent email percent state average length settings nil table id special type type creationtimestamp unit default name created at settings nil id display name created at fingerprint global distinct count nil type type datetime earliest latest base type type datetime table id special type type title name title settings nil id display name product → title fingerprint global distinct count nil type type text percent json percent url percent email percent state average length base type type text source alias products via product id fields joins source query source table filter fields default joins error query fields named not distinct a clojure lang persistentvector distinct non empty sequence of field clauses data rows cols information about your metabase installation hash additional context related to
| 1
|
16,583
| 21,626,639,997
|
IssuesEvent
|
2022-05-05 03:41:42
|
camunda/zeebe
|
https://api.github.com/repos/camunda/zeebe
|
closed
|
Multiple triggered interrupting boundary events can deadlock process instance
|
kind/bug scope/broker area/reliability team/process-automation
|
**Describe the bug**
<!-- A clear and concise description of what the bug is. -->
When multiple interrupting boundary events are triggered simultaneously for a process instance, the process instance may not be able to finish terminating. Such an instance can no longer be canceled from outside either.
This was discovered in a scenario with a parent process that calls a child process that in turn calls another child process (see To-Reproduce section). Each call activity has an interrupting message boundary event that subscribes to the same message (i.e. the same message name and correlation key). When the message is published both call activities are simultaneously interrupted and terminated. However, this can lead to a deadlock in the termination logic:
The child cannot complete the message boundary event and take the sequence flow because its flow scope (the called instance) is set to terminating by the other boundary event. But the called instance cannot terminate because there is still an active flow.
Note that there may exist other ways to hit this bug using other events than messages (likely) and perhaps it also exists with nested embedded subprocesses (unlikely). I have not tested these cases.
**To Reproduce**
<!--
Steps to reproduce the behavior
If possible add a minimal reproducer code sample
- when using the Java client: https://github.com/zeebe-io/zeebe-test-template-java
-->
- Deploy these 3 processes: [Processes.zip](https://github.com/camunda/zeebe/files/8563346/Processes.zip)
- Create an instance `zbctl create instance Level_1`
- Wait for the user task to be active (we just need some wait state)
- Publish a message that correlates to both the interrupting message boundary events (one in Level_1 and one in Level_2): `zbctl --insecure publish message msg --correlationKey='msg'`
- Check the log for rejected complete_element command: `zdb log print -p=/tmp/data/raft-partition/partitions/1/ | jq '.records[].entries[]? | select(.recordType == "COMMAND_REJECTION") | select(.intent == "COMPLETE_ELEMENT")'`
**Expected behavior**
<!-- A clear and concise description of what you expected to happen. -->
The process instance should terminate.
**Log/Stacktrace**
<!-- If possible add the full stacktrace or Zeebe log which contains the issue. -->
<details><summary>Full Stacktrace</summary>
<p>
```
{
"partitionId": 1,
"value": {
"version": 1,
"parentElementInstanceKey": 2251799813685394,
"parentProcessInstanceKey": 2251799813685390,
"processDefinitionKey": 2251799813685387,
"elementId": "Event_1pi5foh",
"bpmnProcessId": "Level_2",
"processInstanceKey": 2251799813685396,
"flowScopeKey": 2251799813685396,
"bpmnElementType": "BOUNDARY_EVENT"
},
"key": 2251799813685411,
"timestamp": 1650977539258,
"intent": "COMPLETE_ELEMENT",
"position": 534,
"valueType": "PROCESS_INSTANCE",
"recordType": "COMMAND_REJECTION",
"rejectionType": "INVALID_STATE",
"rejectionReason": "Expected flow scope instance to be in state 'ELEMENT_ACTIVATED' but was 'ELEMENT_TERMINATING'.",
"brokerVersion": "1.3.6",
"sourceRecordPosition": 532
}
```
</p>
</details>
**Environment:**
- Zeebe Version: 1.3.6 (untested in newer versions)
|
1.0
|
Multiple triggered interrupting boundary events can deadlock process instance - **Describe the bug**
<!-- A clear and concise description of what the bug is. -->
When multiple interrupting boundary events are triggered simultaneously for a process instance, the process instance may not be able to finish terminating. Such an instance can no longer be canceled from outside either.
This was discovered in a scenario with a parent process that calls a child process that in turn calls another child process (see To-Reproduce section). Each call activity has an interrupting message boundary event that subscribes to the same message (i.e. the same message name and correlation key). When the message is published both call activities are simultaneously interrupted and terminated. However, this can lead to a deadlock in the termination logic:
The child cannot complete the message boundary event and take the sequence flow because its flow scope (the called instance) is set to terminating by the other boundary event. But the called instance cannot terminate because there is still an active flow.
Note that there may exist other ways to hit this bug using other events than messages (likely) and perhaps it also exists with nested embedded subprocesses (unlikely). I have not tested these cases.
**To Reproduce**
<!--
Steps to reproduce the behavior
If possible add a minimal reproducer code sample
- when using the Java client: https://github.com/zeebe-io/zeebe-test-template-java
-->
- Deploy these 3 processes: [Processes.zip](https://github.com/camunda/zeebe/files/8563346/Processes.zip)
- Create an instance `zbctl create instance Level_1`
- Wait for the user task to be active (we just need some wait state)
- Publish a message that correlates to both the interrupting message boundary events (one in Level_1 and one in Level_2): `zbctl --insecure publish message msg --correlationKey='msg'`
- Check the log for rejected complete_element command: `zdb log print -p=/tmp/data/raft-partition/partitions/1/ | jq '.records[].entries[]? | select(.recordType == "COMMAND_REJECTION") | select(.intent == "COMPLETE_ELEMENT")'`
**Expected behavior**
<!-- A clear and concise description of what you expected to happen. -->
The process instance should terminate.
**Log/Stacktrace**
<!-- If possible add the full stacktrace or Zeebe log which contains the issue. -->
<details><summary>Full Stacktrace</summary>
<p>
```
{
"partitionId": 1,
"value": {
"version": 1,
"parentElementInstanceKey": 2251799813685394,
"parentProcessInstanceKey": 2251799813685390,
"processDefinitionKey": 2251799813685387,
"elementId": "Event_1pi5foh",
"bpmnProcessId": "Level_2",
"processInstanceKey": 2251799813685396,
"flowScopeKey": 2251799813685396,
"bpmnElementType": "BOUNDARY_EVENT"
},
"key": 2251799813685411,
"timestamp": 1650977539258,
"intent": "COMPLETE_ELEMENT",
"position": 534,
"valueType": "PROCESS_INSTANCE",
"recordType": "COMMAND_REJECTION",
"rejectionType": "INVALID_STATE",
"rejectionReason": "Expected flow scope instance to be in state 'ELEMENT_ACTIVATED' but was 'ELEMENT_TERMINATING'.",
"brokerVersion": "1.3.6",
"sourceRecordPosition": 532
}
```
</p>
</details>
**Environment:**
- Zeebe Version: 1.3.6 (untested in newer versions)
|
process
|
multiple triggered interrupting boundary events can deadlock process instance describe the bug when multiple interrupting boundary events are triggered simultaneously for a process instance the process instance may not be able to finish terminating such an instance can no longer be canceled from outside either this was discovered in a scenario with a parent process that calls a child process that in turn calls another child process see to reproduce section each call activity has an interrupting message boundary event that subscribes to the same message i e the same message name and correlation key when the message is published both call activities are simultaneously interrupted and terminated however this can lead to a deadlock in the termination logic the child cannot complete the message boundary event and take the sequence flow because its flow scope the called instance is set to terminating by the other boundary event but the called instance cannot terminate because there is still an active flow note that there may exist other ways to hit this bug using other events than messages likely and perhaps it also exists with nested embedded subprocesses unlikely i have not tested these cases to reproduce steps to reproduce the behavior if possible add a minimal reproducer code sample when using the java client deploy these processes create an instance zbctl create instance level wait for the user task to be active we just need some wait state publish a message that correlates to both the interrupting message boundary events one in level and one in level zbctl insecure publish message msg correlationkey msg check the log for rejected complete element command zdb log print p tmp data raft partition partitions jq records entries select recordtype command rejection select intent complete element expected behavior the process instance should terminate log stacktrace full stacktrace partitionid value version parentelementinstancekey parentprocessinstancekey 
processdefinitionkey elementid event bpmnprocessid level processinstancekey flowscopekey bpmnelementtype boundary event key timestamp intent complete element position valuetype process instance recordtype command rejection rejectiontype invalid state rejectionreason expected flow scope instance to be in state element activated but was element terminating brokerversion sourcerecordposition environment zeebe version untested in newer versions
| 1
|
256,981
| 19,483,891,121
|
IssuesEvent
|
2021-12-26 00:23:13
|
KinsonDigital/Velaptor
|
https://api.github.com/repos/KinsonDigital/Velaptor
|
opened
|
🚧Update copyright in project file to 2022
|
documentation good first issue high priority preview
|
### I have done the items below . . .
- [X] I have updated the title by replacing the '**_<title>_**' section.
### Description
Update the copyright in the project file from `2020` to `2022`
### Acceptance Criteria
**This issue is finished when:**
- [ ] Code documentation added if required
~Unit tests added~
~All unit tests pass~
### ToDo Items
- [ ] Draft pull request created and linked to this issue
- [X] Priority label added to issue (**_low priority_**, **_medium priority_**, or **_high priority_**)
- [X] Issue linked to the proper project
- [X] Issue linked to proper milestone
- [ ] Unit tests have been written and/or adjusted for code additions or changes
- [ ] All unit tests pass
### Issue Dependencies
_No response_
### Related Work
_No response_
|
1.0
|
🚧Update copyright in project file to 2022 - ### I have done the items below . . .
- [X] I have updated the title by replacing the '**_<title>_**' section.
### Description
Update the copyright in the project file from `2020` to `2022`
### Acceptance Criteria
**This issue is finished when:**
- [ ] Code documentation added if required
~Unit tests added~
~All unit tests pass~
### ToDo Items
- [ ] Draft pull request created and linked to this issue
- [X] Priority label added to issue (**_low priority_**, **_medium priority_**, or **_high priority_**)
- [X] Issue linked to the proper project
- [X] Issue linked to proper milestone
- [ ] Unit tests have been written and/or adjusted for code additions or changes
- [ ] All unit tests pass
### Issue Dependencies
_No response_
### Related Work
_No response_
|
non_process
|
🚧update copyright in project file to i have done the items below i have updated the title by replacing the section description update the copyright in the project file from to acceptance criteria this issue is finished when code documentation added if required unit tests added all unit tests pass todo items draft pull request created and linked to this issue priority label added to issue low priority medium priority or high priority issue linked to the proper project issue linked to proper milestone unit tests have been written and or adjusted for code additions or changes all unit tests pass issue dependencies no response related work no response
| 0
|
665
| 3,135,929,990
|
IssuesEvent
|
2015-09-10 17:27:43
|
wekan/wekan
|
https://api.github.com/repos/wekan/wekan
|
closed
|
Release v0.9
|
Meta:Release-process
|
I've released the [first release candidate](https://github.com/wekan/wekan/releases/tag/v0.9.0-rc1) of Wekan v0.9:
* [Release notes](https://github.com/wekan/wekan/blob/master/History.md#next--v09)
* [Official docker image](https://hub.docker.com/r/mquandalle/wekan/)
* [Official sandstorm package](https://apps.sandstorm.io/app/m86q05rdvj14yvn78ghaxynqz7u2svw6rnttptxx49g1785cdv1h)
Go download and test it! Bugs that I plan to fix before the final v0.9 release are tracked using the corresponding [GitHub milestone](https://github.com/wekan/wekan/milestones/Release%200.9).
As it introduces a new user interface, this release changes quite a bit of displayed text, and thus requires some translation work. If you are interested to contribute to it you can create or join a team on [our Transifex project](https://www.transifex.com/wekan/wekan/).
I'll also set up a discourse forum (or do you prefer something else?) for the final v0.9.
|
1.0
|
Release v0.9 - I've released the [first release candidate](https://github.com/wekan/wekan/releases/tag/v0.9.0-rc1) of Wekan v0.9:
* [Release notes](https://github.com/wekan/wekan/blob/master/History.md#next--v09)
* [Official docker image](https://hub.docker.com/r/mquandalle/wekan/)
* [Official sandstorm package](https://apps.sandstorm.io/app/m86q05rdvj14yvn78ghaxynqz7u2svw6rnttptxx49g1785cdv1h)
Go download and test it! Bugs that I plan to fix before the final v0.9 release are tracked using the corresponding [GitHub milestone](https://github.com/wekan/wekan/milestones/Release%200.9).
As it introduces a new user interface, this release changes quite a bit of displayed text, and thus requires some translation work. If you are interested to contribute to it you can create or join a team on [our Transifex project](https://www.transifex.com/wekan/wekan/).
I'll also set up a discourse forum (or do you prefer something else?) for the final v0.9.
|
process
|
release i ve released the of wekan go download and test it bugs that i plan to fix before the final release are tracked using the corresponding as it introduces a new user interface this release changes quite a bit of displayed text and thus requires some translation work if you are interested to contribute to it you can create or join a team on i ll also set up a discourse forum or do you prefer something else for the final
| 1
|
15,684
| 19,847,823,638
|
IssuesEvent
|
2022-01-21 08:55:08
|
ooi-data/CE04OSSM-RID26-07-NUTNRB000-recovered_inst-suna_instrument_recovered
|
https://api.github.com/repos/ooi-data/CE04OSSM-RID26-07-NUTNRB000-recovered_inst-suna_instrument_recovered
|
opened
|
🛑 Processing failed: ValueError
|
process
|
## Overview
`ValueError` found in `processing_task` task during run ended on 2022-01-21T08:55:07.736867.
## Details
Flow name: `CE04OSSM-RID26-07-NUTNRB000-recovered_inst-suna_instrument_recovered`
Task name: `processing_task`
Error type: `ValueError`
Error message: not enough values to unpack (expected 3, got 0)
<details>
<summary>Traceback</summary>
```
Traceback (most recent call last):
File "/srv/conda/envs/notebook/lib/python3.9/site-packages/ooi_harvester/processor/pipeline.py", line 165, in processing
final_path = finalize_data_stream(
File "/srv/conda/envs/notebook/lib/python3.9/site-packages/ooi_harvester/processor/__init__.py", line 84, in finalize_data_stream
append_to_zarr(mod_ds, final_store, enc, logger=logger)
File "/srv/conda/envs/notebook/lib/python3.9/site-packages/ooi_harvester/processor/__init__.py", line 357, in append_to_zarr
_append_zarr(store, mod_ds)
File "/srv/conda/envs/notebook/lib/python3.9/site-packages/ooi_harvester/processor/utils.py", line 187, in _append_zarr
existing_arr.append(var_data.values)
File "/srv/conda/envs/notebook/lib/python3.9/site-packages/xarray/core/variable.py", line 519, in values
return _as_array_or_item(self._data)
File "/srv/conda/envs/notebook/lib/python3.9/site-packages/xarray/core/variable.py", line 259, in _as_array_or_item
data = np.asarray(data)
File "/srv/conda/envs/notebook/lib/python3.9/site-packages/dask/array/core.py", line 1541, in __array__
x = self.compute()
File "/srv/conda/envs/notebook/lib/python3.9/site-packages/dask/base.py", line 288, in compute
(result,) = compute(self, traverse=False, **kwargs)
File "/srv/conda/envs/notebook/lib/python3.9/site-packages/dask/base.py", line 571, in compute
results = schedule(dsk, keys, **kwargs)
File "/srv/conda/envs/notebook/lib/python3.9/site-packages/dask/threaded.py", line 79, in get
results = get_async(
File "/srv/conda/envs/notebook/lib/python3.9/site-packages/dask/local.py", line 507, in get_async
raise_exception(exc, tb)
File "/srv/conda/envs/notebook/lib/python3.9/site-packages/dask/local.py", line 315, in reraise
raise exc
File "/srv/conda/envs/notebook/lib/python3.9/site-packages/dask/local.py", line 220, in execute_task
result = _execute_task(task, data)
File "/srv/conda/envs/notebook/lib/python3.9/site-packages/dask/core.py", line 119, in _execute_task
return func(*(_execute_task(a, cache) for a in args))
File "/srv/conda/envs/notebook/lib/python3.9/site-packages/dask/array/core.py", line 116, in getter
c = np.asarray(c)
File "/srv/conda/envs/notebook/lib/python3.9/site-packages/xarray/core/indexing.py", line 357, in __array__
return np.asarray(self.array, dtype=dtype)
File "/srv/conda/envs/notebook/lib/python3.9/site-packages/xarray/core/indexing.py", line 551, in __array__
self._ensure_cached()
File "/srv/conda/envs/notebook/lib/python3.9/site-packages/xarray/core/indexing.py", line 548, in _ensure_cached
self.array = NumpyIndexingAdapter(np.asarray(self.array))
File "/srv/conda/envs/notebook/lib/python3.9/site-packages/xarray/core/indexing.py", line 521, in __array__
return np.asarray(self.array, dtype=dtype)
File "/srv/conda/envs/notebook/lib/python3.9/site-packages/xarray/core/indexing.py", line 422, in __array__
return np.asarray(array[self.key], dtype=None)
File "/srv/conda/envs/notebook/lib/python3.9/site-packages/xarray/coding/variables.py", line 70, in __array__
return self.func(self.array)
File "/srv/conda/envs/notebook/lib/python3.9/site-packages/xarray/coding/variables.py", line 137, in _apply_mask
data = np.asarray(data, dtype=dtype)
File "/srv/conda/envs/notebook/lib/python3.9/site-packages/xarray/core/indexing.py", line 422, in __array__
return np.asarray(array[self.key], dtype=None)
File "/srv/conda/envs/notebook/lib/python3.9/site-packages/xarray/backends/zarr.py", line 73, in __getitem__
return array[key.tuple]
File "/srv/conda/envs/notebook/lib/python3.9/site-packages/zarr/core.py", line 673, in __getitem__
return self.get_basic_selection(selection, fields=fields)
File "/srv/conda/envs/notebook/lib/python3.9/site-packages/zarr/core.py", line 798, in get_basic_selection
return self._get_basic_selection_nd(selection=selection, out=out,
File "/srv/conda/envs/notebook/lib/python3.9/site-packages/zarr/core.py", line 841, in _get_basic_selection_nd
return self._get_selection(indexer=indexer, out=out, fields=fields)
File "/srv/conda/envs/notebook/lib/python3.9/site-packages/zarr/core.py", line 1135, in _get_selection
lchunk_coords, lchunk_selection, lout_selection = zip(*indexer)
ValueError: not enough values to unpack (expected 3, got 0)
```
</details>
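The final frame of the traceback shows the failure mechanism: zarr builds `(chunk_coords, chunk_selection, out_selection)` triples from an indexer, and `zip(*indexer)` yields nothing when the selection matched zero chunks, so the three-way unpack fails. A minimal reproduction, assuming an empty chunk indexer stands in for such a selection:

```python
# zarr's _get_selection unpacks (chunk_coords, chunk_selection, out_selection)
# triples from an indexer; if the selection covers zero chunks, zip(*...)
# produces no triples at all and the unpack raises ValueError.
indexer = []  # stands in for a selection that matched no chunks

try:
    lchunk_coords, lchunk_selection, lout_selection = zip(*indexer)
except ValueError as e:
    message = str(e)

# Same error message as in the traceback above.
assert message == "not enough values to unpack (expected 3, got 0)"
```

This suggests the appended variable's dask graph resolved to an empty selection against the existing zarr store, rather than a corrupt store.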
|
1.0
|
🛑 Processing failed: ValueError - ## Overview
`ValueError` found in `processing_task` task during run ended on 2022-01-21T08:55:07.736867.
## Details
Flow name: `CE04OSSM-RID26-07-NUTNRB000-recovered_inst-suna_instrument_recovered`
Task name: `processing_task`
Error type: `ValueError`
Error message: not enough values to unpack (expected 3, got 0)
<details>
<summary>Traceback</summary>
```
Traceback (most recent call last):
File "/srv/conda/envs/notebook/lib/python3.9/site-packages/ooi_harvester/processor/pipeline.py", line 165, in processing
final_path = finalize_data_stream(
File "/srv/conda/envs/notebook/lib/python3.9/site-packages/ooi_harvester/processor/__init__.py", line 84, in finalize_data_stream
append_to_zarr(mod_ds, final_store, enc, logger=logger)
File "/srv/conda/envs/notebook/lib/python3.9/site-packages/ooi_harvester/processor/__init__.py", line 357, in append_to_zarr
_append_zarr(store, mod_ds)
File "/srv/conda/envs/notebook/lib/python3.9/site-packages/ooi_harvester/processor/utils.py", line 187, in _append_zarr
existing_arr.append(var_data.values)
File "/srv/conda/envs/notebook/lib/python3.9/site-packages/xarray/core/variable.py", line 519, in values
return _as_array_or_item(self._data)
File "/srv/conda/envs/notebook/lib/python3.9/site-packages/xarray/core/variable.py", line 259, in _as_array_or_item
data = np.asarray(data)
File "/srv/conda/envs/notebook/lib/python3.9/site-packages/dask/array/core.py", line 1541, in __array__
x = self.compute()
File "/srv/conda/envs/notebook/lib/python3.9/site-packages/dask/base.py", line 288, in compute
(result,) = compute(self, traverse=False, **kwargs)
File "/srv/conda/envs/notebook/lib/python3.9/site-packages/dask/base.py", line 571, in compute
results = schedule(dsk, keys, **kwargs)
File "/srv/conda/envs/notebook/lib/python3.9/site-packages/dask/threaded.py", line 79, in get
results = get_async(
File "/srv/conda/envs/notebook/lib/python3.9/site-packages/dask/local.py", line 507, in get_async
raise_exception(exc, tb)
File "/srv/conda/envs/notebook/lib/python3.9/site-packages/dask/local.py", line 315, in reraise
raise exc
File "/srv/conda/envs/notebook/lib/python3.9/site-packages/dask/local.py", line 220, in execute_task
result = _execute_task(task, data)
File "/srv/conda/envs/notebook/lib/python3.9/site-packages/dask/core.py", line 119, in _execute_task
return func(*(_execute_task(a, cache) for a in args))
File "/srv/conda/envs/notebook/lib/python3.9/site-packages/dask/array/core.py", line 116, in getter
c = np.asarray(c)
File "/srv/conda/envs/notebook/lib/python3.9/site-packages/xarray/core/indexing.py", line 357, in __array__
return np.asarray(self.array, dtype=dtype)
File "/srv/conda/envs/notebook/lib/python3.9/site-packages/xarray/core/indexing.py", line 551, in __array__
self._ensure_cached()
File "/srv/conda/envs/notebook/lib/python3.9/site-packages/xarray/core/indexing.py", line 548, in _ensure_cached
self.array = NumpyIndexingAdapter(np.asarray(self.array))
File "/srv/conda/envs/notebook/lib/python3.9/site-packages/xarray/core/indexing.py", line 521, in __array__
return np.asarray(self.array, dtype=dtype)
File "/srv/conda/envs/notebook/lib/python3.9/site-packages/xarray/core/indexing.py", line 422, in __array__
return np.asarray(array[self.key], dtype=None)
File "/srv/conda/envs/notebook/lib/python3.9/site-packages/xarray/coding/variables.py", line 70, in __array__
return self.func(self.array)
File "/srv/conda/envs/notebook/lib/python3.9/site-packages/xarray/coding/variables.py", line 137, in _apply_mask
data = np.asarray(data, dtype=dtype)
File "/srv/conda/envs/notebook/lib/python3.9/site-packages/xarray/core/indexing.py", line 422, in __array__
return np.asarray(array[self.key], dtype=None)
File "/srv/conda/envs/notebook/lib/python3.9/site-packages/xarray/backends/zarr.py", line 73, in __getitem__
return array[key.tuple]
File "/srv/conda/envs/notebook/lib/python3.9/site-packages/zarr/core.py", line 673, in __getitem__
return self.get_basic_selection(selection, fields=fields)
File "/srv/conda/envs/notebook/lib/python3.9/site-packages/zarr/core.py", line 798, in get_basic_selection
return self._get_basic_selection_nd(selection=selection, out=out,
File "/srv/conda/envs/notebook/lib/python3.9/site-packages/zarr/core.py", line 841, in _get_basic_selection_nd
return self._get_selection(indexer=indexer, out=out, fields=fields)
File "/srv/conda/envs/notebook/lib/python3.9/site-packages/zarr/core.py", line 1135, in _get_selection
lchunk_coords, lchunk_selection, lout_selection = zip(*indexer)
ValueError: not enough values to unpack (expected 3, got 0)
```
</details>
|
process
|
🛑 processing failed valueerror overview valueerror found in processing task task during run ended on details flow name recovered inst suna instrument recovered task name processing task error type valueerror error message not enough values to unpack expected got traceback traceback most recent call last file srv conda envs notebook lib site packages ooi harvester processor pipeline py line in processing final path finalize data stream file srv conda envs notebook lib site packages ooi harvester processor init py line in finalize data stream append to zarr mod ds final store enc logger logger file srv conda envs notebook lib site packages ooi harvester processor init py line in append to zarr append zarr store mod ds file srv conda envs notebook lib site packages ooi harvester processor utils py line in append zarr existing arr append var data values file srv conda envs notebook lib site packages xarray core variable py line in values return as array or item self data file srv conda envs notebook lib site packages xarray core variable py line in as array or item data np asarray data file srv conda envs notebook lib site packages dask array core py line in array x self compute file srv conda envs notebook lib site packages dask base py line in compute result compute self traverse false kwargs file srv conda envs notebook lib site packages dask base py line in compute results schedule dsk keys kwargs file srv conda envs notebook lib site packages dask threaded py line in get results get async file srv conda envs notebook lib site packages dask local py line in get async raise exception exc tb file srv conda envs notebook lib site packages dask local py line in reraise raise exc file srv conda envs notebook lib site packages dask local py line in execute task result execute task task data file srv conda envs notebook lib site packages dask core py line in execute task return func execute task a cache for a in args file srv conda envs notebook lib site packages dask array core py line in getter c np asarray c file srv conda envs notebook lib site packages xarray core indexing py line in array return np asarray self array dtype dtype file srv conda envs notebook lib site packages xarray core indexing py line in array self ensure cached file srv conda envs notebook lib site packages xarray core indexing py line in ensure cached self array numpyindexingadapter np asarray self array file srv conda envs notebook lib site packages xarray core indexing py line in array return np asarray self array dtype dtype file srv conda envs notebook lib site packages xarray core indexing py line in array return np asarray array dtype none file srv conda envs notebook lib site packages xarray coding variables py line in array return self func self array file srv conda envs notebook lib site packages xarray coding variables py line in apply mask data np asarray data dtype dtype file srv conda envs notebook lib site packages xarray core indexing py line in array return np asarray array dtype none file srv conda envs notebook lib site packages xarray backends zarr py line in getitem return array file srv conda envs notebook lib site packages zarr core py line in getitem return self get basic selection selection fields fields file srv conda envs notebook lib site packages zarr core py line in get basic selection return self get basic selection nd selection selection out out file srv conda envs notebook lib site packages zarr core py line in get basic selection nd return self get selection indexer indexer out out fields fields file srv conda envs notebook lib site packages zarr core py line in get selection lchunk coords lchunk selection lout selection zip indexer valueerror not enough values to unpack expected got
| 1
|
7,718
| 10,822,041,060
|
IssuesEvent
|
2019-11-08 20:11:17
|
bazelbuild/vscode-bazel
|
https://api.github.com/repos/bazelbuild/vscode-bazel
|
closed
|
`npm outdated`: Unpin/updated some deps?
|
type: process
|
Current setup seems to have a few things listed:
`npm outdated` -
```
Package               Current  Wanted  Latest   Location
@types/node           6.14.2   6.14.4  11.13.4  vscode-bazel
tslint                5.11.0   5.15.0  5.15.0   vscode-bazel
typescript            3.1.6    3.4.3   3.4.3    vscode-bazel
vscode-debugadapter   1.32.1   1.34.0  1.34.0   vscode-bazel
vscode-debugprotocol  1.32.0   1.34.0  1.34.0   vscode-bazel
```
|
1.0
|
`npm outdated`: Unpin/updated some deps? - Current setup seems to have a few things listed:
`npm outdated` -
```
Package               Current  Wanted  Latest   Location
@types/node           6.14.2   6.14.4  11.13.4  vscode-bazel
tslint                5.11.0   5.15.0  5.15.0   vscode-bazel
typescript            3.1.6    3.4.3   3.4.3    vscode-bazel
vscode-debugadapter   1.32.1   1.34.0  1.34.0   vscode-bazel
vscode-debugprotocol  1.32.0   1.34.0  1.34.0   vscode-bazel
```
|
process
|
npm outdated unpin updated some deps current setup seems to have a few things listed npm outdated package current wanted latest location types node vscode bazel tslint vscode bazel typescript vscode bazel vscode debugadapter vscode bazel vscode debugprotocol vscode bazel
| 1
|
4,470
| 7,333,611,953
|
IssuesEvent
|
2018-03-05 19:58:10
|
hashicorp/packer
|
https://api.github.com/repos/hashicorp/packer
|
closed
|
metadata.json contains vmware_desktop instead of vmware_fusion
|
bug good first issue post-processor/vagrant
|
Version: 1.2.0
Platform: OS X 10.12.6
Using the templates found here https://github.com/boxcutter/windows a vmware box built on OSX will have `{"provider":"vmware_desktop"}` in its metadata.json instead of `vmware_fusion` which makes vagrant barf when you try to bring up the box with `vmware_fusion` as the provider.
It appears `provider` is hardcoded to be `vmware_desktop` irrespective of the platform on which the vmware box was built here https://github.com/hashicorp/packer/blob/ebe995c0ff49d479f41b60f977af724eee54690e/post-processor/vagrant/vmware.go#L18
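Until the hardcoded value is fixed in the post-processor, the box's `metadata.json` can be rewritten after the build. A minimal sketch of that workaround (function name and file path are hypothetical, not part of packer):

```python
import json

def fix_provider(metadata_path: str, provider: str = "vmware_fusion") -> None:
    """Rewrite the provider field that the vagrant post-processor hardcodes."""
    with open(metadata_path) as f:
        meta = json.load(f)
    meta["provider"] = provider  # replaces the hardcoded "vmware_desktop"
    with open(metadata_path, "w") as f:
        json.dump(meta, f)
```

Repacking the box with the patched `metadata.json` lets `vagrant up --provider=vmware_fusion` proceed.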
|
1.0
|
metadata.json contains vmware_desktop instead of vmware_fusion - Version: 1.2.0
Platform: OS X 10.12.6
Using the templates found here https://github.com/boxcutter/windows a vmware box built on OSX will have `{"provider":"vmware_desktop"}` in its metadata.json instead of `vmware_fusion` which makes vagrant barf when you try to bring up the box with `vmware_fusion` as the provider.
It appears `provider` is hardcoded to be `vmware_desktop` irrespective of the platform on which the vmware box was built here https://github.com/hashicorp/packer/blob/ebe995c0ff49d479f41b60f977af724eee54690e/post-processor/vagrant/vmware.go#L18
|
process
|
metadata json contains vmware desktop instead of vmware fusion version platform os x using the templates found here a vmware box built on osx will have provider vmware desktop in its metadata json instead of vmware fusion which makes vagrant barf when you try to bring up the box with vmware fusion as the provider it appears provider is hardcoded to be vmware desktop irrespective of the platform on which the vmware box was built here
| 1
|
166,375
| 20,718,465,481
|
IssuesEvent
|
2022-03-13 01:48:33
|
ghc-dev/Randy-Weber
|
https://api.github.com/repos/ghc-dev/Randy-Weber
|
opened
|
nancy.1.4.3.nupkg: 1 vulnerabilities (highest severity is: 9.8)
|
security vulnerability
|
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>nancy.1.4.3.nupkg</b></p></summary>
<p>Nancy is a lightweight web framework for the .Net platform, inspired by Sinatra. Nancy aim at delive...</p>
<p>Library home page: <a href="https://api.nuget.org/packages/nancy.1.4.3.nupkg">https://api.nuget.org/packages/nancy.1.4.3.nupkg</a></p>
<p>Path to dependency file: /ConsoleApp1.csproj</p>
<p>Path to vulnerable library: /t/packages/nancy/1.4.3/nancy.1.4.3.nupkg</p>
<p>
<p>Found in HEAD commit: <a href="https://github.com/ghc-dev/Randy-Weber/commit/f90e68f41f8428d428b86e00275b4ca939882445">f90e68f41f8428d428b86e00275b4ca939882445</a></p></details>
## Vulnerabilities
| CVE | Severity | <img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS | Dependency | Type | Fixed in | Remediation Available |
| ------------- | ------------- | ----- | ----- | ----- | --- | --- |
| [CVE-2017-9785](https://vuln.whitesourcesoftware.com/vulnerability/CVE-2017-9785) | <img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> High | 9.8 | nancy.1.4.3.nupkg | Direct | Nancy - 1.4.4,2.0 | ✅ |
## Details
<details>
<summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> CVE-2017-9785</summary>
### Vulnerable Library - <b>nancy.1.4.3.nupkg</b></p>
<p>Nancy is a lightweight web framework for the .Net platform, inspired by Sinatra. Nancy aim at delive...</p>
<p>Library home page: <a href="https://api.nuget.org/packages/nancy.1.4.3.nupkg">https://api.nuget.org/packages/nancy.1.4.3.nupkg</a></p>
<p>Path to dependency file: /ConsoleApp1.csproj</p>
<p>Path to vulnerable library: /t/packages/nancy/1.4.3/nancy.1.4.3.nupkg</p>
<p>
Dependency Hierarchy:
- :x: **nancy.1.4.3.nupkg** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/ghc-dev/Randy-Weber/commit/f90e68f41f8428d428b86e00275b4ca939882445">f90e68f41f8428d428b86e00275b4ca939882445</a></p>
<p>Found in base branch: <b>main</b></p>
</p>
<p></p>
### Vulnerability Details
<p>
Csrf.cs in NancyFX Nancy before 1.4.4 and 2.x before 2.0-dangermouse has Remote Code Execution via Deserialization of JSON data in a CSRF Cookie.
<p>Publish Date: 2017-07-20
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2017-9785>CVE-2017-9785</a></p>
</p>
<p></p>
### CVSS 3 Score Details (<b>9.8</b>)
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: High
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
<p></p>
### Suggested Fix
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="http://www.cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2017-9785">http://www.cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2017-9785</a></p>
<p>Release Date: 2017-07-20</p>
<p>Fix Resolution: Nancy - 1.4.4,2.0</p>
</p>
<p></p>
:rescue_worker_helmet: Automatic Remediation is available for this issue
</details>
***
<p>:rescue_worker_helmet: Automatic Remediation is available for this issue.</p>
<!-- <REMEDIATE>[{"isOpenPROnVulnerability":true,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"Nuget","packageName":"Nancy","packageVersion":"1.4.3","packageFilePaths":["/ConsoleApp1.csproj"],"isTransitiveDependency":false,"dependencyTree":"Nancy:1.4.3","isMinimumFixVersionAvailable":true,"minimumFixVersion":"Nancy - 1.4.4,2.0","isBinary":false}],"baseBranches":["main"],"vulnerabilityIdentifier":"CVE-2017-9785","vulnerabilityDetails":"Csrf.cs in NancyFX Nancy before 1.4.4 and 2.x before 2.0-dangermouse has Remote Code Execution via Deserialization of JSON data in a CSRF Cookie.","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2017-9785","cvss3Severity":"high","cvss3Score":"9.8","cvss3Metrics":{"A":"High","AC":"Low","PR":"None","S":"Unchanged","C":"High","UI":"None","AV":"Network","I":"High"},"extraData":{}}]</REMEDIATE> -->
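The vulnerability class here — deserializing attacker-controlled cookie data with a deserializer that can instantiate arbitrary objects — is avoided by using a data-only parser. A minimal Python sketch of the safe pattern (cookie payload hypothetical; Nancy itself is .NET, this only illustrates the principle):

```python
import json

# Hypothetical cookie payload under attacker control.
raw_cookie = '{"csrf_token": "abc123"}'

# json.loads only builds plain data structures (dict/list/str/number),
# so attacker-controlled input cannot trigger code execution here --
# unlike object deserializers such as pickle or BinaryFormatter-style APIs.
data = json.loads(raw_cookie)
assert data["csrf_token"] == "abc123"
```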
|
True
|
nancy.1.4.3.nupkg: 1 vulnerabilities (highest severity is: 9.8) - <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>nancy.1.4.3.nupkg</b></p></summary>
<p>Nancy is a lightweight web framework for the .Net platform, inspired by Sinatra. Nancy aim at delive...</p>
<p>Library home page: <a href="https://api.nuget.org/packages/nancy.1.4.3.nupkg">https://api.nuget.org/packages/nancy.1.4.3.nupkg</a></p>
<p>Path to dependency file: /ConsoleApp1.csproj</p>
<p>Path to vulnerable library: /t/packages/nancy/1.4.3/nancy.1.4.3.nupkg</p>
<p>
<p>Found in HEAD commit: <a href="https://github.com/ghc-dev/Randy-Weber/commit/f90e68f41f8428d428b86e00275b4ca939882445">f90e68f41f8428d428b86e00275b4ca939882445</a></p></details>
## Vulnerabilities
| CVE | Severity | <img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS | Dependency | Type | Fixed in | Remediation Available |
| ------------- | ------------- | ----- | ----- | ----- | --- | --- |
| [CVE-2017-9785](https://vuln.whitesourcesoftware.com/vulnerability/CVE-2017-9785) | <img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> High | 9.8 | nancy.1.4.3.nupkg | Direct | Nancy - 1.4.4,2.0 | ✅ |
## Details
<details>
<summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> CVE-2017-9785</summary>
### Vulnerable Library - <b>nancy.1.4.3.nupkg</b></p>
<p>Nancy is a lightweight web framework for the .Net platform, inspired by Sinatra. Nancy aim at delive...</p>
<p>Library home page: <a href="https://api.nuget.org/packages/nancy.1.4.3.nupkg">https://api.nuget.org/packages/nancy.1.4.3.nupkg</a></p>
<p>Path to dependency file: /ConsoleApp1.csproj</p>
<p>Path to vulnerable library: /t/packages/nancy/1.4.3/nancy.1.4.3.nupkg</p>
<p>
Dependency Hierarchy:
- :x: **nancy.1.4.3.nupkg** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/ghc-dev/Randy-Weber/commit/f90e68f41f8428d428b86e00275b4ca939882445">f90e68f41f8428d428b86e00275b4ca939882445</a></p>
<p>Found in base branch: <b>main</b></p>
</p>
<p></p>
### Vulnerability Details
<p>
Csrf.cs in NancyFX Nancy before 1.4.4 and 2.x before 2.0-dangermouse has Remote Code Execution via Deserialization of JSON data in a CSRF Cookie.
<p>Publish Date: 2017-07-20
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2017-9785>CVE-2017-9785</a></p>
</p>
<p></p>
### CVSS 3 Score Details (<b>9.8</b>)
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: High
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
<p></p>
### Suggested Fix
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="http://www.cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2017-9785">http://www.cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2017-9785</a></p>
<p>Release Date: 2017-07-20</p>
<p>Fix Resolution: Nancy - 1.4.4,2.0</p>
</p>
<p></p>
:rescue_worker_helmet: Automatic Remediation is available for this issue
</details>
***
<p>:rescue_worker_helmet: Automatic Remediation is available for this issue.</p>
<!-- <REMEDIATE>[{"isOpenPROnVulnerability":true,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"Nuget","packageName":"Nancy","packageVersion":"1.4.3","packageFilePaths":["/ConsoleApp1.csproj"],"isTransitiveDependency":false,"dependencyTree":"Nancy:1.4.3","isMinimumFixVersionAvailable":true,"minimumFixVersion":"Nancy - 1.4.4,2.0","isBinary":false}],"baseBranches":["main"],"vulnerabilityIdentifier":"CVE-2017-9785","vulnerabilityDetails":"Csrf.cs in NancyFX Nancy before 1.4.4 and 2.x before 2.0-dangermouse has Remote Code Execution via Deserialization of JSON data in a CSRF Cookie.","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2017-9785","cvss3Severity":"high","cvss3Score":"9.8","cvss3Metrics":{"A":"High","AC":"Low","PR":"None","S":"Unchanged","C":"High","UI":"None","AV":"Network","I":"High"},"extraData":{}}]</REMEDIATE> -->
|
non_process
|
nancy nupkg vulnerabilities highest severity is vulnerable library nancy nupkg nancy is a lightweight web framework for the net platform inspired by sinatra nancy aim at delive library home page a href path to dependency file csproj path to vulnerable library t packages nancy nancy nupkg found in head commit a href vulnerabilities cve severity cvss dependency type fixed in remediation available high nancy nupkg direct nancy details cve vulnerable library nancy nupkg nancy is a lightweight web framework for the net platform inspired by sinatra nancy aim at delive library home page a href path to dependency file csproj path to vulnerable library t packages nancy nancy nupkg dependency hierarchy x nancy nupkg vulnerable library found in head commit a href found in base branch main vulnerability details csrf cs in nancyfx nancy before and x before dangermouse has remote code execution via deserialization of json data in a csrf cookie publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction none scope unchanged impact metrics confidentiality impact high integrity impact high availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution nancy rescue worker helmet automatic remediation is available for this issue rescue worker helmet automatic remediation is available for this issue istransitivedependency false dependencytree nancy isminimumfixversionavailable true minimumfixversion nancy isbinary false basebranches vulnerabilityidentifier cve vulnerabilitydetails csrf cs in nancyfx nancy before and x before dangermouse has remote code execution via deserialization of json data in a csrf cookie vulnerabilityurl
| 0
|
17,495
| 23,305,507,996
|
IssuesEvent
|
2022-08-07 23:50:04
|
lynnandtonic/nestflix.fun
|
https://api.github.com/repos/lynnandtonic/nestflix.fun
|
closed
|
Add Brazzos (2022) from "Only Murders in the Building" (Screenshots and Poster Added)
|
suggested title in process
|
Please add as much of the following info as you can:
Title: Brazzos (2022)
Type (film/tv show): TV series - detective drama
Film or show in which it appears: Only Murders in the Building
Is the parent film/show streaming anywhere? Yes - Hulu
About when in the parent film/show does it appear? Ep. 2x06 - "Performance Review"
Actual footage of the film/show can be seen (yes/no)? Yes
Timestamp:
- 7:12-8:10
Cast: Naomi Jackson, Charles-Haden Savage
Tagline: This sends the investigation in a whole new direction.








|
1.0
|
Add Brazzos (2022) from "Only Murders in the Building" (Screenshots and Poster Added) - Please add as much of the following info as you can:
Title: Brazzos (2022)
Type (film/tv show): TV series - detective drama
Film or show in which it appears: Only Murders in the Building
Is the parent film/show streaming anywhere? Yes - Hulu
About when in the parent film/show does it appear? Ep. 2x06 - "Performance Review"
Actual footage of the film/show can be seen (yes/no)? Yes
Timestamp:
- 7:12-8:10
Cast: Naomi Jackson, Charles-Haden Savage
Tagline: This sends the investigation in a whole new direction.








|
process
|
add brazzos from only murders in the building screenshots and poster added please add as much of the following info as you can title brazzos type film tv show tv series detective drama film or show in which it appears only murders in the building is the parent film show streaming anywhere yes hulu about when in the parent film show does it appear ep performance review actual footage of the film show can be seen yes no yes timestamp cast naomi jackson charles haden savage tagline this sends the investigation in a whole new direction
| 1
|
19,629
| 25,986,244,980
|
IssuesEvent
|
2022-12-20 00:40:25
|
devssa/onde-codar-em-salvador
|
https://api.github.com/repos/devssa/onde-codar-em-salvador
|
closed
|
SYSTEMS ANALYST at [AVANSYS]
|
SALVADOR WORDPRESS HELP WANTED GEOPROCESSAMENTO ARCGIS JOOMLA Stale
|
<!--
==================================================
PLEASE ONLY POST IF THE POSITION IS FOR SALVADOR AND NEIGHBORING CITIES!
Use: "Desenvolvedor Front-end" instead of
"Front-End Developer" \o/
Example: `[JAVASCRIPT] [MYSQL] [NODE.JS] Desenvolvedor Front-End na [NOME DA EMPRESA]`
==================================================
-->
## Location
- Salvador
## Requirements
**Required:**
- Completed higher education
- Experience maintaining and updating system content, and managing content in geoprocessing tools, ARCGIS, JOOMLA, and WordPress in similar environments
## Hiring terms
- to be agreed
## Our company
- Avansys Tecnologia provides specialized software development and maintenance services through its CMMI3 Software Factory, consulting for all types of organizations, and Service Desk and outsourcing services to meet all of your organization's needs.
## How to apply
- Please send an email to curriculo@avansys.com.br with your CV attached - subject line: ANALISTA DE SISTEMAS - 2020
|
1.0
|
SYSTEMS ANALYST at [AVANSYS] - <!--
==================================================
PLEASE ONLY POST IF THE POSITION IS FOR SALVADOR AND NEIGHBORING CITIES!
Use: "Desenvolvedor Front-end" instead of
"Front-End Developer" \o/
Example: `[JAVASCRIPT] [MYSQL] [NODE.JS] Desenvolvedor Front-End na [NOME DA EMPRESA]`
==================================================
-->
## Location
- Salvador
## Requirements
**Required:**
- Completed higher education
- Experience maintaining and updating system content, and managing content in geoprocessing tools, ARCGIS, JOOMLA, and WordPress in similar environments
## Hiring terms
- to be agreed
## Our company
- Avansys Tecnologia provides specialized software development and maintenance services through its CMMI3 Software Factory, consulting for all types of organizations, and Service Desk and outsourcing services to meet all of your organization's needs.
## How to apply
- Please send an email to curriculo@avansys.com.br with your CV attached - subject line: ANALISTA DE SISTEMAS - 2020
|
process
|
analista de sistemas na por favor só poste se a vaga for para salvador e cidades vizinhas use desenvolvedor front end ao invés de front end developer o exemplo desenvolvedor front end na local salvador requisitos obrigatórios superior completo experiência em manutenção atualização de conteúdos de sistemas e gestão de conteúdos em geoprocessamentos arcgis joomla e wordpress em ambientes similares contratação a combinar nossa empresa a avansys tecnologia presta efetivamente serviços especializados de desenvolvimento e manutenção de softwares utilizando sua fabrica de software consultoria para todos os tipos de organização serviço de service desk e outsourcing para suprir todas as necessidades de sua organização como se candidatar por favor envie um email para curriculo avansys com br com seu cv anexado enviar no assunto analista de sistemas
| 1
|
61,424
| 25,518,311,625
|
IssuesEvent
|
2022-11-28 18:10:51
|
cityofaustin/atd-data-tech
|
https://api.github.com/repos/cityofaustin/atd-data-tech
|
closed
|
[Enhancement] Show the dev environment alert banner in all environments except production
|
Service: Dev Type: Enhancement Product: Moped Type: Snackoo 🍫 Project: Moped v2.0
|
To stay on the safe side, we should always show the alert banner indicating when you're not in the production Moped environment.
[Discussion](https://austininnovation.slack.com/archives/CNUEPKLB1/p1668562704235619).
|
1.0
|
[Enhancement] Show the dev environment alert banner in all environments except production - To stay on the safe side, we should always show the alert banner indicating when you're not in the production Moped environment.
[Discussion](https://austininnovation.slack.com/archives/CNUEPKLB1/p1668562704235619).
|
non_process
|
show the dev environment alert banner in all environments except production to stay on the safe side we should always show the alert banner indicating when you re not in the production moped environment
| 0
|
13,658
| 16,373,579,537
|
IssuesEvent
|
2021-05-15 16:43:47
|
bridgetownrb/bridgetown
|
https://api.github.com/repos/bridgetownrb/bridgetown
|
closed
|
Add /bin/false to netlify configuration
|
process
|
Currently Netlify routinely fails to build the site due to it thinking there are no changes to the website, even though we want it to build due to other underlying code or data changes. Apparently we can fix this by adding to the `netlify.toml` file:
```
[build]
ignore = "/bin/false"
```
|
1.0
|
Add /bin/false to netlify configuration - Currently Netlify routinely fails to build the site due to it thinking there are no changes to the website, even though we want it to build due to other underlying code or data changes. Apparently we can fix this by adding to the `netlify.toml` file:
```
[build]
ignore = "/bin/false"
```
|
process
|
add bin false to netlify configuration currently netlify routinely fails to build the site due to it thinking there are no changes to the website even though we want it to build due to other underlying code or data changes apparently we can fix this by adding to the netlify toml file ignore bin false
| 1
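The netlify record above hinges on a simple mechanism: Netlify skips a build when its `ignore` command exits 0 ("no changes"), and `/bin/false` exits non-zero unconditionally, so the build is never skipped. A minimal sketch (assuming a POSIX system where `/bin/false` exists) demonstrating that exit-code behavior:

```python
import subprocess

# Netlify runs the [build] ignore command and skips the deploy if it
# exits 0; /bin/false always exits with a non-zero status, so every
# push is treated as "changed" and the site always rebuilds.
result = subprocess.run(["/bin/false"])
print(result.returncode)  # non-zero
```

This is why the one-line `ignore = "/bin/false"` entry in `netlify.toml` is enough to force builds even when the site content itself is unchanged.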
|
213,522
| 24,007,142,565
|
IssuesEvent
|
2022-09-14 15:36:13
|
centrica-engineering/muon
|
https://api.github.com/repos/centrica-engineering/muon
|
closed
|
CVE-2022-25858 (High) detected in terser-5.12.1.tgz, terser-4.8.0.tgz - autoclosed
|
security vulnerability
|
## CVE-2022-25858 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Libraries - <b>terser-5.12.1.tgz</b>, <b>terser-4.8.0.tgz</b></p></summary>
<p>
<details><summary><b>terser-5.12.1.tgz</b></p></summary>
<p>JavaScript parser, mangler/compressor and beautifier toolkit for ES6+</p>
<p>Library home page: <a href="https://registry.npmjs.org/terser/-/terser-5.12.1.tgz">https://registry.npmjs.org/terser/-/terser-5.12.1.tgz</a></p>
<p>Path to dependency file: /package.json</p>
<p>Path to vulnerable library: /node_modules/terser/package.json</p>
<p>
Dependency Hierarchy:
- @muonic/muon-0.0.2-alpha.2.tgz (Root Library)
- building-rollup-2.0.1.tgz
- rollup-plugin-terser-7.0.2.tgz
- :x: **terser-5.12.1.tgz** (Vulnerable Library)
</details>
<details><summary><b>terser-4.8.0.tgz</b></p></summary>
<p>JavaScript parser, mangler/compressor and beautifier toolkit for ES6+</p>
<p>Library home page: <a href="https://registry.npmjs.org/terser/-/terser-4.8.0.tgz">https://registry.npmjs.org/terser/-/terser-4.8.0.tgz</a></p>
<p>Path to dependency file: /package.json</p>
<p>Path to vulnerable library: /node_modules/babel-plugin-template-html-minifier/node_modules/terser/package.json,/node_modules/@open-wc/building-rollup/node_modules/terser/package.json,/node_modules/@open-wc/building-utils/node_modules/terser/package.json</p>
<p>
Dependency Hierarchy:
- @muonic/muon-0.0.2-alpha.2.tgz (Root Library)
- building-rollup-2.0.1.tgz
- :x: **terser-4.8.0.tgz** (Vulnerable Library)
</details>
<p>Found in base branch: <b>main</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
The package terser before 4.8.1, from 5.0.0 and before 5.14.2 are vulnerable to Regular Expression Denial of Service (ReDoS) due to insecure usage of regular expressions.
<p>Publish Date: 2022-07-15
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2022-25858>CVE-2022-25858</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.5</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2022-25858">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2022-25858</a></p>
<p>Release Date: 2022-07-15</p>
<p>Fix Resolution: terser - 4.8.1,5.14.2</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with Mend [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
|
True
|
CVE-2022-25858 (High) detected in terser-5.12.1.tgz, terser-4.8.0.tgz - autoclosed - ## CVE-2022-25858 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Libraries - <b>terser-5.12.1.tgz</b>, <b>terser-4.8.0.tgz</b></p></summary>
<p>
<details><summary><b>terser-5.12.1.tgz</b></p></summary>
<p>JavaScript parser, mangler/compressor and beautifier toolkit for ES6+</p>
<p>Library home page: <a href="https://registry.npmjs.org/terser/-/terser-5.12.1.tgz">https://registry.npmjs.org/terser/-/terser-5.12.1.tgz</a></p>
<p>Path to dependency file: /package.json</p>
<p>Path to vulnerable library: /node_modules/terser/package.json</p>
<p>
Dependency Hierarchy:
- @muonic/muon-0.0.2-alpha.2.tgz (Root Library)
- building-rollup-2.0.1.tgz
- rollup-plugin-terser-7.0.2.tgz
- :x: **terser-5.12.1.tgz** (Vulnerable Library)
</details>
<details><summary><b>terser-4.8.0.tgz</b></p></summary>
<p>JavaScript parser, mangler/compressor and beautifier toolkit for ES6+</p>
<p>Library home page: <a href="https://registry.npmjs.org/terser/-/terser-4.8.0.tgz">https://registry.npmjs.org/terser/-/terser-4.8.0.tgz</a></p>
<p>Path to dependency file: /package.json</p>
<p>Path to vulnerable library: /node_modules/babel-plugin-template-html-minifier/node_modules/terser/package.json,/node_modules/@open-wc/building-rollup/node_modules/terser/package.json,/node_modules/@open-wc/building-utils/node_modules/terser/package.json</p>
<p>
Dependency Hierarchy:
- @muonic/muon-0.0.2-alpha.2.tgz (Root Library)
- building-rollup-2.0.1.tgz
- :x: **terser-4.8.0.tgz** (Vulnerable Library)
</details>
<p>Found in base branch: <b>main</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
The package terser before 4.8.1, from 5.0.0 and before 5.14.2 are vulnerable to Regular Expression Denial of Service (ReDoS) due to insecure usage of regular expressions.
<p>Publish Date: 2022-07-15
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2022-25858>CVE-2022-25858</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.5</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2022-25858">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2022-25858</a></p>
<p>Release Date: 2022-07-15</p>
<p>Fix Resolution: terser - 4.8.1,5.14.2</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with Mend [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
|
non_process
|
cve high detected in terser tgz terser tgz autoclosed cve high severity vulnerability vulnerable libraries terser tgz terser tgz terser tgz javascript parser mangler compressor and beautifier toolkit for library home page a href path to dependency file package json path to vulnerable library node modules terser package json dependency hierarchy muonic muon alpha tgz root library building rollup tgz rollup plugin terser tgz x terser tgz vulnerable library terser tgz javascript parser mangler compressor and beautifier toolkit for library home page a href path to dependency file package json path to vulnerable library node modules babel plugin template html minifier node modules terser package json node modules open wc building rollup node modules terser package json node modules open wc building utils node modules terser package json dependency hierarchy muonic muon alpha tgz root library building rollup tgz x terser tgz vulnerable library found in base branch main vulnerability details the package terser before from and before are vulnerable to regular expression denial of service redos due to insecure usage of regular expressions publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction none scope unchanged impact metrics confidentiality impact none integrity impact none availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution terser step up your open source security game with mend
| 0
|
8,490
| 11,647,041,824
|
IssuesEvent
|
2020-03-01 13:00:32
|
processing/processing
|
https://api.github.com/repos/processing/processing
|
closed
|
enum & interface after a bad line logs an error
|
preprocessor
|
## Description
Entering (just pasting) the following code throws a long error in the console:
```java
void f (int) {}
enum e {a}
interface i {}
```
does strange stuff.
## Expected Behavior
Some error should appear in the errors tab about the unnamed parameter `int`
## Current Behavior
This is written to the console:
```
Status ERROR: org.eclipse.jdt.core code=4 Exception occurred during compilation unit conversion:
----------------------------------- SOURCE BEGIN -------------------------------------
import processing.core.*;
import processing.data.*;
import processing.event.*;
import processing.opengl.*;
import java.util.HashMap;
import java.util.ArrayList;
import java.io.File;
import java.io.BufferedReader;
import java.io.PrintWriter;
import java.io.InputStream;
import java.io.OutputStream;
import java.io.IOException;
public class sketch_180902a extends PApplet {
void f (int) {}
enum e {a}
interface i {}
}
----------------------------------- SOURCE END ------------------------------------- java.lang.IllegalArgumentException
```
`ctrl+click`ing on `f`, among other things, can also freeze the editor
## My Environment
* Processing version: 3.4, also happens on latest commit here
* OS: Linux Mint 18.3 Cinnamon
|
1.0
|
enum & interface after a bad line logs an error - ## Description
Entering (just pasting) the following code throws a long error in the console:
```java
void f (int) {}
enum e {a}
interface i {}
```
does strange stuff.
## Expected Behavior
Some error should appear in the errors tab about the unnamed parameter `int`
## Current Behavior
This is written to the console:
```
Status ERROR: org.eclipse.jdt.core code=4 Exception occurred during compilation unit conversion:
----------------------------------- SOURCE BEGIN -------------------------------------
import processing.core.*;
import processing.data.*;
import processing.event.*;
import processing.opengl.*;
import java.util.HashMap;
import java.util.ArrayList;
import java.io.File;
import java.io.BufferedReader;
import java.io.PrintWriter;
import java.io.InputStream;
import java.io.OutputStream;
import java.io.IOException;
public class sketch_180902a extends PApplet {
void f (int) {}
enum e {a}
interface i {}
}
----------------------------------- SOURCE END ------------------------------------- java.lang.IllegalArgumentException
```
`ctrl+click`ing on `f`, among other things, can also freeze the editor
## My Environment
* Processing version: 3.4, also happens on latest commit here
* OS: Linux Mint 18.3 Cinnamon
|
process
|
enum interface after a bad line logs an error description entering just pasting the following code throws a long error in the console java void f int enum e a interface i does strange stuff expected behavior some error should appear in the errors tab about the unnamed parameter int current behavior this is written to the console status error org eclipse jdt core code exception occurred during compilation unit conversion source begin import processing core import processing data import processing event import processing opengl import java util hashmap import java util arraylist import java io file import java io bufferedreader import java io printwriter import java io inputstream import java io outputstream import java io ioexception public class sketch extends papplet void f int enum e a interface i source end java lang illegalargumentexception ctrl click ing on f among other things can also freeze the editor my environment processing version also happens on latest commit here os linux mint cinnamon
| 1
|
160,183
| 13,783,358,924
|
IssuesEvent
|
2020-10-08 19:05:29
|
ni/niveristand-scan-engine-ethercat-custom-device
|
https://api.github.com/repos/ni/niveristand-scan-engine-ethercat-custom-device
|
closed
|
Not correct "Add-On Installation Instructions"
|
bug documentation
|
**Describe the bug**
Repository don`t have "Scan Engine" folder.
**Desktop (please complete the following information):**
- OS: Windows 10
- Veristand: 2019
- LabVIEW: 2019
**Additional context**
If i put folder "Custom Device Source" to "<Public Documents>\National Instruments\NI VeriStand 20xx\Custom Devices" i get error:
LabVIEW: (Hex 0x7) File not found. The file might be in a different location or deleted. Use the command prompt or the file explorer to verify that the path is correct.
VI Path: C:\Users\Public\Documents\National Instruments\NI VeriStand 2019\Custom Devices\Scan Engine\Scan Engine - Configuration.llb\Scan Engine Initialization VI.vi
|
1.0
|
Not correct "Add-On Installation Instructions" - **Describe the bug**
Repository don`t have "Scan Engine" folder.
**Desktop (please complete the following information):**
- OS: Windows 10
- Veristand: 2019
- LabVIEW: 2019
**Additional context**
If i put folder "Custom Device Source" to "<Public Documents>\National Instruments\NI VeriStand 20xx\Custom Devices" i get error:
LabVIEW: (Hex 0x7) File not found. The file might be in a different location or deleted. Use the command prompt or the file explorer to verify that the path is correct.
VI Path: C:\Users\Public\Documents\National Instruments\NI VeriStand 2019\Custom Devices\Scan Engine\Scan Engine - Configuration.llb\Scan Engine Initialization VI.vi
|
non_process
|
not correct add on installation instructions describe the bug repository don t have scan engine folder desktop please complete the following information os windows veristand labview additional context if i put folder custom device source to national instruments ni veristand custom devices i get error labview hex file not found the file might be in a different location or deleted use the command prompt or the file explorer to verify that the path is correct vi path c users public documents national instruments ni veristand custom devices scan engine scan engine configuration llb scan engine initialization vi vi
| 0
|
317,529
| 9,666,374,045
|
IssuesEvent
|
2019-05-21 10:38:57
|
DigitalCampus/oppia-mobile-android
|
https://api.github.com/repos/DigitalCampus/oppia-mobile-android
|
closed
|
Points activity graph page - not showing description correctly
|
Medium priority bug
|
The description in the points list is showing "Description 1" etc rather than the actual description of the points (eg "for completing activity xxxx"), see:

|
1.0
|
Points activity graph page - not showing description correctly - The description in the points list is showing "Description 1" etc rather than the actual description of the points (eg "for completing activity xxxx"), see:

|
non_process
|
points activity graph page not showing description correctly the description in the points list is showing description etc rather than the actual description of the points eg for completing activity xxxx see
| 0
|
19,697
| 26,047,573,925
|
IssuesEvent
|
2022-12-22 15:37:49
|
MicrosoftDocs/azure-devops-docs
|
https://api.github.com/repos/MicrosoftDocs/azure-devops-docs
|
closed
|
Should have a list of keywords - eg: how to specify Not Equal, Greater than etc
|
doc-enhancement devops/prod Pri2 devops-cicd-process/tech
|
Should have a list of keywords - eg: how to specify Not Equal, Greater than etc
---
#### Document Details
⚠ *Do not edit this section. It is required for docs.microsoft.com ➟ GitHub issue linking.*
* ID: d322215c-8025-4f21-0700-7dfa7dc5c46e
* Version Independent ID: 141fcdbb-8394-525b-bb29-eff9a693a9c4
* Content: [Stages in Azure Pipelines - Azure Pipelines](https://docs.microsoft.com/en-us/azure/devops/pipelines/process/stages?view=azure-devops&tabs=yaml)
* Content Source: [docs/pipelines/process/stages.md](https://github.com/MicrosoftDocs/azure-devops-docs/blob/master/docs/pipelines/process/stages.md)
* Product: **devops**
* Technology: **devops-cicd-process**
* GitHub Login: @juliakm
* Microsoft Alias: **jukullam**
|
1.0
|
Should have a list of keywords - eg: how to specify Not Equal, Greater than etc -
Should have a list of keywords - eg: how to specify Not Equal, Greater than etc
---
#### Document Details
⚠ *Do not edit this section. It is required for docs.microsoft.com ➟ GitHub issue linking.*
* ID: d322215c-8025-4f21-0700-7dfa7dc5c46e
* Version Independent ID: 141fcdbb-8394-525b-bb29-eff9a693a9c4
* Content: [Stages in Azure Pipelines - Azure Pipelines](https://docs.microsoft.com/en-us/azure/devops/pipelines/process/stages?view=azure-devops&tabs=yaml)
* Content Source: [docs/pipelines/process/stages.md](https://github.com/MicrosoftDocs/azure-devops-docs/blob/master/docs/pipelines/process/stages.md)
* Product: **devops**
* Technology: **devops-cicd-process**
* GitHub Login: @juliakm
* Microsoft Alias: **jukullam**
|
process
|
should have a list of keywords eg how to specify not equal greater than etc should have a list of keywords eg how to specify not equal greater than etc document details ⚠ do not edit this section it is required for docs microsoft com ➟ github issue linking id version independent id content content source product devops technology devops cicd process github login juliakm microsoft alias jukullam
| 1
|
4,695
| 7,531,300,136
|
IssuesEvent
|
2018-04-15 03:44:48
|
pelias/schema
|
https://api.github.com/repos/pelias/schema
|
closed
|
single character street names
|
processed
|
Street names such as "K Rd" are currently not retrievable from the index due to the single grams not being stored.
We are now storing single digits, so we should also store single characters.
Alternatively we can wait until we investigate the FST improvements in elasticsearch@2+ which might allow us to start storing single grams for all names.
The simplest solution would be to add another filter similar to [prefixZeroToSingleDigitNumbers](https://github.com/pelias/schema/blob/master/settings.js#L178-L182).
|
1.0
|
single character street names - Street names such as "K Rd" are currently not retrievable from the index due to the single grams not being stored.
We are now storing single digits, so we should also store single characters.
Alternatively we can wait until we investigate the FST improvements in elasticsearch@2+ which might allow us to start storing single grams for all names.
The simplest solution would be to add another filter similar to [prefixZeroToSingleDigitNumbers](https://github.com/pelias/schema/blob/master/settings.js#L178-L182).
|
process
|
single character street names street names such as k rd are currently not retrievable from the index due to the single grams not being stored we are now storing single digits so we should also store single characters alternatively we can wait until we investigate the fst improvements in elasticsearch which might allow us to start storing single grams for all names the simplest solution would be to add another filter similar to
| 1
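The pelias record above mentions a `prefixZeroToSingleDigitNumbers` filter as the model for keeping single-character street tokens (like the "K" in "K Rd") retrievable. As a hypothetical sketch — not pelias' actual implementation — the padding idea can be illustrated as a token transform that rewrites lone digits so single-gram tokens survive indexing; an analogous rule could cover single letters:

```python
import re

# Hypothetical analogue of the filter described in the record: pad any
# single-digit token with a leading zero so it is no longer a one-gram;
# a similar pattern could prefix single-letter tokens such as "k".
def prefix_zero_to_single_digits(tokens):
    return ["0" + t if re.fullmatch(r"\d", t) else t for t in tokens]

print(prefix_zero_to_single_digits(["k", "rd", "5"]))  # -> ['k', 'rd', '05']
```

In pelias itself this kind of rule is expressed as an Elasticsearch analysis filter in `settings.js` rather than application code.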
|
282,997
| 8,712,707,864
|
IssuesEvent
|
2018-12-06 23:11:46
|
FIDUCEO/FCDR_HIRS
|
https://api.github.com/repos/FIDUCEO/FCDR_HIRS
|
closed
|
Inconsistencies between radiance calculations, unexpected impact of changing ε
|
Priority: Very high bug
|
Differences between radiances calculated directly and through the symbolic measurement equation are too large when either emissivity or non-linearity are switched off individually. With a codebase that includes a15aeed and 9e67af7 (such that #336 should not be the cause anymore), with a breakpoint at
https://github.com/FIDUCEO/FCDR_HIRS/blob/9e67af790da5a9f541ad3b7ad69702d77a33509a/FCDR_HIRS/fcdr.py#L1722
I then prepare debugging with
```
import collections
locals().update({str(k): ureg.Quantity(typhon.math.common.promote_maximally((v.sel(calibrated_channel=4) if "calibrated_channel" in v.dims else v)), v.units).ravel()[0] for (k, v) in self._make_adict_dims_consistent_if_needed(collections.ChainMap(self._quantities, self._other_quantities), me.symbols["R_e"]).items()})
R_e_full = me.expressions[me.symbols["R_e"]]
R_e_simp = me.expression_Re_simplified
from .measurement_equation import expressions, symbols
R_e_full2 = me.recursive_substitution(R_e_full, expressions=me.expressions, stop_at={symbols["T_IWCT"], symbols["h"], symbols["c"], symbols["k_b"]})
```
and compare:
* between complete radiances, -0.058% difference:
```
In : x = eval(str(R_e_full.subs({"R_selfIWCT": 0, "R_selfs": 0, "O_RIWCT": 0, "O_Re": 0, "R_refl": 0}))); y=R_e; print(x, y, x-y, "{:%}".format(((x-y)/x).m), sep="\n")
1.427965404115273e-12 watt / hertz / meter ** 2 / steradian
1.4287890253991072e-12 watt / hertz / meter ** 2 / steradian
-8.236212838342122e-16 watt / hertz / meter ** 2 / steradian
-0.057678%
```
* assuming linearity only, 6.40% difference:
```
In : x = eval(str(R_e_full.subs({"R_selfIWCT": 0, "R_selfs": 0, "O_RIWCT": 0, "O_Re": 0, "R_refl": 0, "a_2": 0}))); y=rad_wn_linear; print(x, y, x-y, "{:%}".format(((x-y)/x).m), sep="\n")
1.5131952299227614e-12 watt / hertz / meter ** 2 / steradian
1.4163937849327702e-12 watt / hertz / meter ** 2 / steradian
9.680144498999117e-14 watt / hertz / meter ** 2 / steradian
6.397155%
```
* assuming no emissivity correction only, -7.6% difference:
```
In : x = eval(str(R_e_full.subs({"R_selfIWCT": 0, "R_selfs": 0, "O_RIWCT": 0, "O_Re": 0, "R_refl": 0, "a_3": 0}))); y=rad_wn_noεcorr; print(x, y, x-y, "{:%}".format(((x-y)/x).m), sep="\n")
1.3041246534944941e-12 watt / hertz / meter ** 2 / steradian
1.4026476662769981e-12 watt / hertz / meter ** 2 / steradian
-9.852301278250398e-14 watt / hertz / meter ** 2 / steradian
-7.554724%
```
* neither emissivity correction nor nonlinearity, -0.064% difference:
```
In : x = eval(str(R_e_full.subs({"R_selfIWCT": 0, "R_selfs": 0, "O_RIWCT": 0, "O_Re": 0, "R_refl": 0, "a_2": 0, "a_3": 0}))); y=rad_wn_linearnoεcorr; print(x, y, x-y, "{:%}".format(((x-y)/x).m), sep="\n")
1.3893544793019825e-12 watt / hertz / meter ** 2 / steradian
1.390252425810661e-12 watt / hertz / meter ** 2 / steradian
-8.97946508678595e-16 watt / hertz / meter ** 2 / steradian
-0.064630%
```
Differences in the order of 0.06% may be compatible with the difference in band correction or direct calculations, but there is clearly a serious bug when I get a 6% difference. I trust the symbolic version better than the direct version, so I suspect there is a bug in the latter.
|
1.0
|
Inconsistencies between radiance calculations, unexpected impact of changing ε - Differences between radiances calculated directly and through the symbolic measurement equation are too large when either emissivity or non-linearity are switched off individually. With a codebase that includes a15aeed and 9e67af7 (such that #336 should not be the cause anymore), with a breakpoint at
https://github.com/FIDUCEO/FCDR_HIRS/blob/9e67af790da5a9f541ad3b7ad69702d77a33509a/FCDR_HIRS/fcdr.py#L1722
I then prepare debugging with
```
import collections
locals().update({str(k): ureg.Quantity(typhon.math.common.promote_maximally((v.sel(calibrated_channel=4) if "calibrated_channel" in v.dims else v)), v.units).ravel()[0] for (k, v) in self._make_adict_dims_consistent_if_needed(collections.ChainMap(self._quantities, self._other_quantities), me.symbols["R_e"]).items()})
R_e_full = me.expressions[me.symbols["R_e"]]
R_e_simp = me.expression_Re_simplified
from .measurement_equation import expressions, symbols
R_e_full2 = me.recursive_substitution(R_e_full, expressions=me.expressions, stop_at={symbols["T_IWCT"], symbols["h"], symbols["c"], symbols["k_b"]})
```
and compare:
* between complete radiances, -0.058% difference:
```
In : x = eval(str(R_e_full.subs({"R_selfIWCT": 0, "R_selfs": 0, "O_RIWCT": 0, "O_Re": 0, "R_refl": 0}))); y=R_e; print(x, y, x-y, "{:%}".format(((x-y)/x).m), sep="\n")
1.427965404115273e-12 watt / hertz / meter ** 2 / steradian
1.4287890253991072e-12 watt / hertz / meter ** 2 / steradian
-8.236212838342122e-16 watt / hertz / meter ** 2 / steradian
-0.057678%
```
* assuming linearity only, 6.40% difference:
```
In : x = eval(str(R_e_full.subs({"R_selfIWCT": 0, "R_selfs": 0, "O_RIWCT": 0, "O_Re": 0, "R_refl": 0, "a_2": 0}))); y=rad_wn_linear; print(x, y, x-y, "{:%}".format(((x-y)/x).m), sep="\n")
1.5131952299227614e-12 watt / hertz / meter ** 2 / steradian
1.4163937849327702e-12 watt / hertz / meter ** 2 / steradian
9.680144498999117e-14 watt / hertz / meter ** 2 / steradian
6.397155%
```
* assuming no emissivity correction only, -7.6% difference:
```
In : x = eval(str(R_e_full.subs({"R_selfIWCT": 0, "R_selfs": 0, "O_RIWCT": 0, "O_Re": 0, "R_refl": 0, "a_3": 0}))); y=rad_wn_noεcorr; print(x, y, x-y, "{:%}".format(((x-y)/x).m), sep="\n")
1.3041246534944941e-12 watt / hertz / meter ** 2 / steradian
1.4026476662769981e-12 watt / hertz / meter ** 2 / steradian
-9.852301278250398e-14 watt / hertz / meter ** 2 / steradian
-7.554724%
```
* neither emissivity correction nor nonlinearity, -0.064% difference:
```
In : x = eval(str(R_e_full.subs({"R_selfIWCT": 0, "R_selfs": 0, "O_RIWCT": 0, "O_Re": 0, "R_refl": 0, "a_2": 0, "a_3": 0}))); y=rad_wn_linearnoεcorr; print(x, y, x-y, "{:%}".format(((x-y)/x).m), sep="\n")
1.3893544793019825e-12 watt / hertz / meter ** 2 / steradian
1.390252425810661e-12 watt / hertz / meter ** 2 / steradian
-8.97946508678595e-16 watt / hertz / meter ** 2 / steradian
-0.064630%
```
Differences in the order of 0.06% may be compatible with the difference in band correction or direct calculations, but there is clearly a serious bug when I get a 6% difference. I trust the symbolic version better than the direct version, so I suspect there is a bug in the latter.
|
non_process
|
inconsistencies between radiance calculations unexpected impact of changing ε differences between radiances calculated directly and through the symbolic measurement equation are too large when either emissivity or non linearity are switched off individually with a codebase that includes and such that should not be the cause anymore with a breakpoint at i then prepare debugging with import collections locals update str k ureg quantity typhon math common promote maximally v sel calibrated channel if calibrated channel in v dims else v v units ravel for k v in self make adict dims consistent if needed collections chainmap self quantities self other quantities me symbols items r e full me expressions r e simp me expression re simplified from measurement equation import expressions symbols r e me recursive substitution r e full expressions me expressions stop at symbols symbols symbols symbols and compare between complete radiances difference in x eval str r e full subs r selfiwct r selfs o riwct o re r refl y r e print x y x y format x y x m sep n watt hertz meter steradian watt hertz meter steradian watt hertz meter steradian assuming linearity only difference in x eval str r e full subs r selfiwct r selfs o riwct o re r refl a y rad wn linear print x y x y format x y x m sep n watt hertz meter steradian watt hertz meter steradian watt hertz meter steradian assuming no emissivity correction only difference in x eval str r e full subs r selfiwct r selfs o riwct o re r refl a y rad wn noεcorr print x y x y format x y x m sep n watt hertz meter steradian watt hertz meter steradian watt hertz meter steradian neither emissivity correction nor nonlinearity difference in x eval str r e full subs r selfiwct r selfs o riwct o re r refl a a y rad wn linearnoεcorr print x y x y format x y x m sep n watt hertz meter steradian watt hertz meter steradian watt hertz meter steradian differences in the order of may be compatible with the difference in band correction or direct calculations but there is clearly a serious bug when i get a difference i trust the symbolic version better than the direct version so i suspect there is a bug in the latter
| 0
|
3,337
| 6,472,857,751
|
IssuesEvent
|
2017-08-17 14:47:37
|
uvacw/inca
|
https://api.github.com/repos/uvacw/inca
|
closed
|
Sentiment analysis
|
2017_CorpCom_PG_brainstorm CorpCom PersCom PROCESSORS
|
PersCom: Hanneke: Automatic coding/help with coding this content. Things that could be coded automatically are likes/comments/tone of comments
|
1.0
|
Sentiment analysis - PersCom: Hanneke: Automatic coding/help with coding this content. Things that could be coded automatically are likes/comments/tone of comments
|
process
|
sentiment analysis perscom hanneke automatic coding help with coding this content things that could be coded automatically are likes comments tone of comments
| 1
|
2,974
| 10,708,108,267
|
IssuesEvent
|
2019-10-24 18:56:40
|
18F/cg-product
|
https://api.github.com/repos/18F/cg-product
|
closed
|
Fix Kubernetes pod not running false alarms
|
contractor-3-maintainability operations
|
Starting in August or September, we started getting alerts in Prometheus for "Kubernetes pod not running" that seem to be for pods that either are running, or have been deleted and replaced. These alerts seem to never clear, making it difficult to tell when there are real issues with kubernetes.
[Slack thread about this issue](https://gsa-tts.slack.com/archives/C0ENP71UG/p1565960918162500?thread_ts=1565960762.161600&cid=C0ENP71UG) :lock:
## Notes
- We currently manually clear these alerts by deleting the pods that continuously report this, if the alert doesn't clear itself
- `kubectl get pods -a | grep Evicted | awk '{print $1}' | xargs kubectl delete pod `
- The issue doesn't seem to be with Prometheus
## Next steps
- Research whether this is a known issue with Kubernetes
- Determine a path to fix this issue
|
True
|
Fix Kubernetes pod not running false alarms - Starting in August or September, we started getting alerts in Prometheus for "Kubernetes pod not running" that seem to be for pods that either are running, or have been deleted and replaced. These alerts seem to never clear, making it difficult to tell when there are real issues with kubernetes.
[Slack thread about this issue](https://gsa-tts.slack.com/archives/C0ENP71UG/p1565960918162500?thread_ts=1565960762.161600&cid=C0ENP71UG) :lock:
## Notes
- We currently manually clear these alerts by deleting the pods that continuously report this, if the alert doesn't clear itself
- `kubectl get pods -a | grep Evicted | awk '{print $1}' | xargs kubectl delete pod `
- The issue doesn't seem to be with Prometheus
## Next steps
- Research whether this is a known issue with Kubernetes
- Determine a path to fix this issue
|
non_process
|
fix kubernetes pod not running false alarms starting in august or september we started getting alerts in prometheus for kubernetes pod not running that seem to be for pods that either are running or have been deleted and replaced these alerts seem to never clear making it difficult to tell when there are real issues with kubernetes lock notes we currently manually clear these alerts by deleting the pods that continuously report this if the alert doesn t clear itself kubectl get pods a grep evicted awk print xargs kubectl delete pod the issue doesn t seem to be with prometheus next steps research whether this is a known issue with kubernetes determine a path to fix this issue
| 0
|
46,944
| 24,794,622,649
|
IssuesEvent
|
2022-10-24 16:10:55
|
iree-org/iree
|
https://api.github.com/repos/iree-org/iree
|
closed
|
Add `hal_inline` dialect/module for tiny environments.
|
runtime performance ⚡
|
For environments where the execution model is known to be exclusively local and inline (embedded systems) we can have a paired down HAL that pretty much only contains executable support. The idea is to still use HAL executable translation in the compiler but lowering the stream dialect to a new lightweight dialect that pretty much only manages executables and dispatches. Most of the local/ implementation of the executable loader and the loaders themselves have no dependencies on command buffers, allocators, buffers, or devices and can be cleanly pulled into the module without bringing in the bulk of the HAL API.
It's debatable whether the allocator/buffer stuff should be included - that would allow the coming allocator types to be reused but at the cost of additional non-user-controllable overheads. Since the local executables take byte spans all buffers could just be `iree_vm_buffer_t` which is already compiled in and available for use - and since they take a custom `iree_allocator_t` it's still possible for hosting applications to manage memory however they want.
Executables themselves will still be injected on the module when created same as today, allowing for dynamic, static, vmvx, etc executables to be run this way. This allows us to separate the execution model from the deployment model at the cost of a few vtables.
Outline:
* [x] Add `hal_inline` dialect with basic ops:
* [x] `hal_inline.executable.create`
* [x] `hal_inline.executable.dispatch`
* [x] `hal_inline.executable_layout.create`? (still need this to reuse loaders/libraries)
* [x] Add `--execution-mode=` iree-compile flag to switch between `hal-async` and `hal-inline` (or w/e)
* [x] Have a new `iree-hal-inline-transformation-pipeline` that still performs interface materialization and executable translation but otherwise lowers `stream` itself
* [x] Add `iree/modules/hal_inline` runtime module that links directly against the `iree/hal/local/` libraries
* [x] Build a runner tool that uses the inline module (or make iree-run-module/etc always support it with a flag)
(could also call this the `inline` dialect or something - it's still technically a HAL though as the executables being called are abstracted across hardware - can have CPU/FPGA/DSP/etc)
|
True
|
Add `hal_inline` dialect/module for tiny environments. - For environments where the execution model is known to be exclusively local and inline (embedded systems) we can have a paired down HAL that pretty much only contains executable support. The idea is to still use HAL executable translation in the compiler but lowering the stream dialect to a new lightweight dialect that pretty much only manages executables and dispatches. Most of the local/ implementation of the executable loader and the loaders themselves have no dependencies on command buffers, allocators, buffers, or devices and can be cleanly pulled into the module without bringing in the bulk of the HAL API.
It's debatable whether the allocator/buffer stuff should be included - that would allow the coming allocator types to be reused but at the cost of additional non-user-controllable overheads. Since the local executables take byte spans all buffers could just be `iree_vm_buffer_t` which is already compiled in and available for use - and since they take a custom `iree_allocator_t` it's still possible for hosting applications to manage memory however they want.
Executables themselves will still be injected on the module when created same as today, allowing for dynamic, static, vmvx, etc executables to be run this way. This allows us to separate the execution model from the deployment model at the cost of a few vtables.
Outline:
* [x] Add `hal_inline` dialect with basic ops:
* [x] `hal_inline.executable.create`
* [x] `hal_inline.executable.dispatch`
* [x] `hal_inline.executable_layout.create`? (still need this to reuse loaders/libraries)
* [x] Add `--execution-mode=` iree-compile flag to switch between `hal-async` and `hal-inline` (or w/e)
* [x] Have a new `iree-hal-inline-transformation-pipeline` that still performs interface materialization and executable translation but otherwise lowers `stream` itself
* [x] Add `iree/modules/hal_inline` runtime module that links directly against the `iree/hal/local/` libraries
* [x] Build a runner tool that uses the inline module (or make iree-run-module/etc always support it with a flag)
(could also call this the `inline` dialect or something - it's still technically a HAL though as the executables being called are abstracted across hardware - can have CPU/FPGA/DSP/etc)
|
non_process
|
add hal inline dialect module for tiny environments for environments where the execution model is known to be exclusively local and inline embedded systems we can have a paired down hal that pretty much only contains executable support the idea is to still use hal executable translation in the compiler but lowering the stream dialect to a new lightweight dialect that pretty much only manages executables and dispatches most of the local implementation of the executable loader and the loaders themselves have no dependencies on command buffers allocators buffers or devices and can be cleanly pulled into the module without bringing in the bulk of the hal api it s debatable whether the allocator buffer stuff should be included that would allow the coming allocator types to be reused but at the cost of additional non user controllable overheads since the local executables take byte spans all buffers could just be iree vm buffer t which is already compiled in and available for use and since they take a custom iree allocator t it s still possible for hosting applications to manage memory however they want executables themselves will still be injected on the module when created same as today allowing for dynamic static vmvx etc executables to be run this way this allows us to separate the execution model from the deployment model at the cost of a few vtables outline add hal inline dialect with basic ops hal inline executable create hal inline executable dispatch hal inline executable layout create still need this to reuse loaders libraries add execution mode iree compile flag to switch between hal async and hal inline or w e have a new iree hal inline transformation pipeline that still performs interface materialization and executable translation but otherwise lowers stream itself add iree modules hal inline runtime module that links directly against the iree hal local libraries build a runner tool that uses the inline module or make iree run module etc always support it with a flag could also call this the inline dialect or something it s still technically a hal though as the executables being called are abstracted across hardware can have cpu fpga dsp etc
| 0
|
600,891
| 18,361,298,959
|
IssuesEvent
|
2021-10-09 08:48:36
|
AY2122S1-CS2103T-W11-2/tp
|
https://api.github.com/repos/AY2122S1-CS2103T-W11-2/tp
|
closed
|
Adding and removing fields from Staff class
|
type.Task priority.High severity.High
|
To remove: Tags
To add: Role, Status, Index, Schedule
|
1.0
|
Adding and removing fields from Staff class - To remove: Tags
To add: Role, Status, Index, Schedule
|
non_process
|
adding and removing fields from staff class to remove tags to add role status index schedule
| 0
|
52,679
| 7,780,068,631
|
IssuesEvent
|
2018-06-05 18:49:48
|
openmpf/openmpf
|
https://api.github.com/repos/openmpf/openmpf
|
closed
|
Update online docs with Darknet component
|
documentation release 2.1.0
|
The Darknet component needs to be added to the table on [this page](https://openmpf.github.io/docs/site/) and the bulleted list on [this page](https://openmpf.github.io).
|
1.0
|
Update online docs with Darknet component - The Darknet component needs to be added to the table on [this page](https://openmpf.github.io/docs/site/) and the bulleted list on [this page](https://openmpf.github.io).
|
non_process
|
update online docs with darknet component the darknet component needs to be added to the table on and the bulleted list on
| 0
|
17,834
| 23,773,303,651
|
IssuesEvent
|
2022-09-01 18:21:05
|
Azure/azure-sdk-tools
|
https://api.github.com/repos/Azure/azure-sdk-tools
|
opened
|
Onboarding Step: Cadl
|
Cadl Engagement Experience WS: Process Tools & Automation
|
The purpose of this Epic is to define the gaps on the onboarding process inside the Azure SDK Release process affected by Cadl.
Some general parts on the step that need to be accounted for:
- New Service Setup:
- Get Started with Cadl documentation
- Necessary permissions for tooling around Cadl
- Existing Service new to Cadl
- Informational meeting: It should include and prepare service teams
- Assign service buddy
|
1.0
|
Onboarding Step: Cadl - The purpose of this Epic is to define the gaps on the onboarding process inside the Azure SDK Release process affected by Cadl.
Some general parts on the step that need to be accounted for:
- New Service Setup:
- Get Started with Cadl documentation
- Necessary permissions for tooling around Cadl
- Existing Service new to Cadl
- Informational meeting: It should include and prepare service teams
- Assign service buddy
|
process
|
onboarding step cadl the purpose of this epic is to define the gaps on the onboarding process inside the azure sdk release process affected by cadl some general parts on the step that need to be accounted for new service setup get started with cadl documentation necessary permissions for tooling around cadl existing service new to cadl informational meeting it should include and prepare service teams assign service buddy
| 1
|
2,625
| 5,399,567,119
|
IssuesEvent
|
2017-02-27 19:46:31
|
metabase/metabase
|
https://api.github.com/repos/metabase/metabase
|
closed
|
Wrong format for BigQuery timestamp in filters
|
Bug Database/BigQuery Parameters/Variables Query Processor
|
- Chrome 55
- Windows 7
- BigQuery
- Metabase 0.21.1
- Metabase hosting environment: Google App Engine
- Metabase internal database: Google SQL
When using a custom SQL request with a date type filter :
```
SELECT foo
FROM bar
WHERE timestamp >= {{startdate}}
```
I have the error :
```
{"code" 400,
"errors"
[{"domain" "global", "location" "query", "locationType" "other", "message" "14.16 - 14.27: Could not parse '2016-12-01' as a timestamp. Required format is YYYY-MM-DD HH:MM[:SS[.SSSSSS]]", "reason" "invalidQuery"}],
"message" "14.16 - 14.27: Could not parse '2016-12-01' as a timestamp. Required format is YYYY-MM-DD HH:MM[:SS[.SSSSSS]]"}}
```
|
1.0
|
Wrong format for BigQuery timestamp in filters - - Chrome 55
- Windows 7
- BigQuery
- Metabase 0.21.1
- Metabase hosting environment: Google App Engine
- Metabase internal database: Google SQL
When using a custom SQL request with a date type filter :
```
SELECT foo
FROM bar
WHERE timestamp >= {{startdate}}
```
I have the error :
```
{"code" 400,
"errors"
[{"domain" "global", "location" "query", "locationType" "other", "message" "14.16 - 14.27: Could not parse '2016-12-01' as a timestamp. Required format is YYYY-MM-DD HH:MM[:SS[.SSSSSS]]", "reason" "invalidQuery"}],
"message" "14.16 - 14.27: Could not parse '2016-12-01' as a timestamp. Required format is YYYY-MM-DD HH:MM[:SS[.SSSSSS]]"}}
```
|
process
|
wrong format for bigquery timestamp in filters chrome windows bigquery metabase metabase hosting environment google app engine metabase internal database google sql when using a custom sql request with a date type filter select foo from bar where timestamp startdate i have the error code errors reason invalidquery message could not parse as a timestamp required format is yyyy mm dd hh mm
| 1
|
16,189
| 20,626,811,925
|
IssuesEvent
|
2022-03-07 23:45:53
|
bartvanosnabrugge/summaWorkflow_public
|
https://api.github.com/repos/bartvanosnabrugge/summaWorkflow_public
|
opened
|
4b_remapping/2_forcing/3_temperature_lapsing_and_datastep
|
function conversion model agnostic processing
|
- [ ] copy/ change and adjust code to work as python function
- [ ] make sure to add docstring in the sphinx format (see for example util/util.py - read_summa_workflow_control_file or data_specific_processing/merit.py)
- [ ] Add to the description parts of the readme.rst file that come with the summa-cwarhm script that are relevant to understand what the function does precisely.
- [ ] point test_bow_at_banff.py to use the function instead of the wrapper
|
1.0
|
4b_remapping/2_forcing/3_temperature_lapsing_and_datastep - - [ ] copy/ change and adjust code to work as python function
- [ ] make sure to add docstring in the sphinx format (see for example util/util.py - read_summa_workflow_control_file or data_specific_processing/merit.py)
- [ ] Add to the description parts of the readme.rst file that come with the summa-cwarhm script that are relevant to understand what the function does precisely.
- [ ] point test_bow_at_banff.py to use the function instead of the wrapper
|
process
|
remapping forcing temperature lapsing and datastep copy change and adjust code to work as python function make sure to add docstring in the sphinx format see for example util util py read summa workflow control file or data specific processing merit py add to the description parts of the readme rst file that come with the summa cwarhm script that are relevant to understand what the function does precisely point test bow at banff py to use the function instead of the wrapper
| 1
|
9,959
| 12,991,950,199
|
IssuesEvent
|
2020-07-23 05:28:56
|
GoogleCloudPlatform/stackdriver-sandbox
|
https://api.github.com/repos/GoogleCloudPlatform/stackdriver-sandbox
|
closed
|
[docs] Add section to User Guide with ./destroy.sh instructions
|
priority: p2 type: process
|
In order for users to complete their use of Stackdriver Sandbox without incurring additional billing, it is important for the users to run the ./destroy.sh script at the end of their use.
However, the User Guide does not currently have a section instructing users to do so. A section near the end of the User Guide (before the OpenTelemetry section) could be added instructing users to do so.
|
1.0
|
[docs] Add section to User Guide with ./destroy.sh instructions - In order for users to complete their use of Stackdriver Sandbox without incurring additional billing, it is important for the users to run the ./destroy.sh script at the end of their use.
However, the User Guide does not currently have a section instructing users to do so. A section near the end of the User Guide (before the OpenTelemetry section) could be added instructing users to do so.
|
process
|
add section to user guide with destroy sh instructions in order for users to complete their use of stackdriver sandbox without incurring additional billing it is important for the users to run the destroy sh script at the end of their use however the user guide does not currently have a section instructing users to do so a section near the end of the user guide before the opentelemetry section could be added instructing users to do so
| 1
|
118,153
| 25,262,564,247
|
IssuesEvent
|
2022-11-16 00:28:22
|
vektor-inc/vk-blocks-pro
|
https://api.github.com/repos/vektor-inc/vk-blocks-pro
|
closed
|
register_settingのパラメーター取得方法を修正する
|
Code Quality
|
https://developer.wordpress.org/reference/functions/register_setting/
options_schemeでget_properties、get_defaultsしていた方法を廃止する
書式設定( #1057 #1367 )やブロックマネージャー(#1047 )を作っていてschemeをまとめてpropertiesやdefaultsを設定するのは配列の中に配列があった時にどのように対応できない問題が出てきたので、propertiesやdefaultsをそれぞれ定義しようと思います
やや冗長な書き方になるかもしれないですがpropertiesやdefaultsを柔軟に定義できるようになるので結果的には分かりやすくなると思います
何か意見や不明点等ありましたらお気軽にいただければ
https://github.com/vektor-inc/vk-blocks-pro/pull/1300
|
1.0
|
register_settingのパラメーター取得方法を修正する - https://developer.wordpress.org/reference/functions/register_setting/
options_schemeでget_properties、get_defaultsしていた方法を廃止する
書式設定( #1057 #1367 )やブロックマネージャー(#1047 )を作っていてschemeをまとめてpropertiesやdefaultsを設定するのは配列の中に配列があった時にどのように対応できない問題が出てきたので、propertiesやdefaultsをそれぞれ定義しようと思います
やや冗長な書き方になるかもしれないですがpropertiesやdefaultsを柔軟に定義できるようになるので結果的には分かりやすくなると思います
何か意見や不明点等ありましたらお気軽にいただければ
https://github.com/vektor-inc/vk-blocks-pro/pull/1300
|
non_process
|
register settingのパラメーター取得方法を修正する options schemeでget properties、get defaultsしていた方法を廃止する 書式設定 やブロックマネージャー を作っていてschemeをまとめてpropertiesやdefaultsを設定するのは配列の中に配列があった時にどのように対応できない問題が出てきたので、propertiesやdefaultsをそれぞれ定義しようと思います やや冗長な書き方になるかもしれないですがpropertiesやdefaultsを柔軟に定義できるようになるので結果的には分かりやすくなると思います 何か意見や不明点等ありましたらお気軽にいただければ
| 0
|
431,268
| 30,225,651,716
|
IssuesEvent
|
2023-07-06 00:05:57
|
Requisitos-de-Software/2023.1-Simplenote
|
https://api.github.com/repos/Requisitos-de-Software/2023.1-Simplenote
|
closed
|
Corrigir documentos do projeto com base na avaliação do grupo 4
|
documentation Revisão Correção Escrita
|
## Descrição
Corrigir documentos do projeto com base na avaliação do grupo 4
## Tarefas
- Planejamento
- [x] Rich picture
- [x] App
- [x] Cronograma
- [x] Cronograma realizado
- [x] Ferramentas
- [x] Metodologias
- Elicitação
- [x] Introspecção
- [x] Brainstorming
- [x] Entrevista
- [x] Glossário
- [x] Storytelling
- [x] Questionário
- [x] Personas
- [x] FTF
- [x] TLE
- [x] Moscow
- Modelagem
- [x] NFR
- [x] Léxicos
- [x] Cenários
- [x] Especificação Suplementar
- [x] Casos de Uso
- [x] Backlog
- [x] Histórias de usuário
|
1.0
|
Corrigir documentos do projeto com base na avaliação do grupo 4 - ## Descrição
Corrigir documentos do projeto com base na avaliação do grupo 4
## Tarefas
- Planejamento
- [x] Rich picture
- [x] App
- [x] Cronograma
- [x] Cronograma realizado
- [x] Ferramentas
- [x] Metodologias
- Elicitação
- [x] Introspecção
- [x] Brainstorming
- [x] Entrevista
- [x] Glossário
- [x] Storytelling
- [x] Questionário
- [x] Personas
- [x] FTF
- [x] TLE
- [x] Moscow
- Modelagem
- [x] NFR
- [x] Léxicos
- [x] Cenários
- [x] Especificação Suplementar
- [x] Casos de Uso
- [x] Backlog
- [x] Histórias de usuário
|
non_process
|
corrigir documentos do projeto com base na avaliação do grupo descrição corrigir documentos do projeto com base na avaliação do grupo tarefas planejamento rich picture app cronograma cronograma realizado ferramentas metodologias elicitação introspecção brainstorming entrevista glossário storytelling questionário personas ftf tle moscow modelagem nfr léxicos cenários especificação suplementar casos de uso backlog histórias de usuário
| 0
|
1,335
| 3,899,642,871
|
IssuesEvent
|
2016-04-17 21:19:58
|
kerubistan/kerub
|
https://api.github.com/repos/kerubistan/kerub
|
closed
|
allow user to upload disk image
|
component:data processing enhancement priority: normal
|
user should be able to upload a disk image as a file, the file must be stored on a host
|
1.0
|
allow user to upload disk image - user should be able to upload a disk image as a file, the file must be stored on a host
|
process
|
allow user to upload disk image user should be able to upload a disk image as a file the file must be stored on a host
| 1
|
22,226
| 30,775,503,814
|
IssuesEvent
|
2023-07-31 06:02:13
|
h4sh5/npm-auto-scanner
|
https://api.github.com/repos/h4sh5/npm-auto-scanner
|
opened
|
electron-mocha 12.0.1 has 1 guarddog issues
|
npm-silent-process-execution
|
```{"npm-silent-process-execution":[{"code":" const child = spawn(process.execPath, ['cleanup.js', userData], {\n detached: true,\n stdio: 'ignore',\n env: { ELECTRON_RUN_AS_NODE: 1 },\n cwd: __dirname\n })","location":"package/lib/main.js:53","message":"This package is silently executing another executable"}]}```
|
1.0
|
electron-mocha 12.0.1 has 1 guarddog issues - ```{"npm-silent-process-execution":[{"code":" const child = spawn(process.execPath, ['cleanup.js', userData], {\n detached: true,\n stdio: 'ignore',\n env: { ELECTRON_RUN_AS_NODE: 1 },\n cwd: __dirname\n })","location":"package/lib/main.js:53","message":"This package is silently executing another executable"}]}```
|
process
|
electron mocha has guarddog issues npm silent process execution n detached true n stdio ignore n env electron run as node n cwd dirname n location package lib main js message this package is silently executing another executable
| 1
|
11,891
| 14,686,116,231
|
IssuesEvent
|
2021-01-01 13:17:10
|
yuta252/startlens_web_backend
|
https://api.github.com/repos/yuta252/startlens_web_backend
|
closed
|
JWTによるユーザー認証機能及びユーザー機能の追加
|
dev process
|
## 概要
JWTトークンを利用したユーザー認証を実装することでフロントエンドからAPIを利用したときにユーザーのログイン状態を管理できるようにする。ユーザーの新規登録及びログイン処理に必要な処理を実装する。
## 変更点
---
- [x] TokensControllerにおいてログイン時にJWTトークンを生成しレスポンスを返すように実装
- [x] UsersControllerにおいて新規登録時にユーザーとそれに紐づくプロフィールを作成するように実装
- [x] JSON レスポンスをデフォルトのsnake_caseからcamelCaseに変更
- [x] AuthorizationヘッダーのJWTトークンから現在ログインしているユーザーを取得するcurrent_userメソッドの作成
- [x] ActiveModelSeriarizerを利用してレスポンスをテンプレート化
- [x] User model, Profile modelを作成しmigration生成
- [x] バリデーションの日本語化のためi18n gemを活用
## 課題
---
- [ ] Userが作成されるも途中でエラーが発生するとProfileモデルが作成されない処理が発生するためトランザクション処理を検討する
- [ ]
## 参照
---
- [API on Rails 6 written by Alexandre Rousseau](https://leanpub.com/apionrails6)
## 備考
---
|
1.0
|
JWTによるユーザー認証機能及びユーザー機能の追加 - ## 概要
JWTトークンを利用したユーザー認証を実装することでフロントエンドからAPIを利用したときにユーザーのログイン状態を管理できるようにする。ユーザーの新規登録及びログイン処理に必要な処理を実装する。
## 変更点
---
- [x] TokensControllerにおいてログイン時にJWTトークンを生成しレスポンスを返すように実装
- [x] UsersControllerにおいて新規登録時にユーザーとそれに紐づくプロフィールを作成するように実装
- [x] JSON レスポンスをデフォルトのsnake_caseからcamelCaseに変更
- [x] AuthorizationヘッダーのJWTトークンから現在ログインしているユーザーを取得するcurrent_userメソッドの作成
- [x] ActiveModelSeriarizerを利用してレスポンスをテンプレート化
- [x] User model, Profile modelを作成しmigration生成
- [x] バリデーションの日本語化のためi18n gemを活用
## 課題
---
- [ ] Userが作成されるも途中でエラーが発生するとProfileモデルが作成されない処理が発生するためトランザクション処理を検討する
- [ ]
## 参照
---
- [API on Rails 6 written by Alexandre Rousseau](https://leanpub.com/apionrails6)
## 備考
---
|
process
|
jwtによるユーザー認証機能及びユーザー機能の追加 概要 jwtトークンを利用したユーザー認証を実装することでフロントエンドからapiを利用したときにユーザーのログイン状態を管理できるようにする。ユーザーの新規登録及びログイン処理に必要な処理を実装する。 変更点 tokenscontrollerにおいてログイン時にjwtトークンを生成しレスポンスを返すように実装 userscontrollerにおいて新規登録時にユーザーとそれに紐づくプロフィールを作成するように実装 json レスポンスをデフォルトのsnake caseからcamelcaseに変更 authorizationヘッダーのjwtトークンから現在ログインしているユーザーを取得するcurrent userメソッドの作成 activemodelseriarizerを利用してレスポンスをテンプレート化 user model profile modelを作成しmigration生成 gemを活用 課題 userが作成されるも途中でエラーが発生するとprofileモデルが作成されない処理が発生するためトランザクション処理を検討する 参照 備考
| 1
|
225,524
| 24,855,885,702
|
IssuesEvent
|
2022-10-27 02:12:10
|
hackforla/ops
|
https://api.github.com/repos/hackforla/ops
|
opened
|
Split the project-leads user group into developer-group, ops-group, and project-leads group
|
feature: AWS IAM feature: administrative feature: security
|
### Overview
AWS best practice is to give minimum permission to users [only what's needed for the task]. Some users currently have more than that.
### Action Items
- [ ] Create a developer-group
- [ ] Create an ops-group
- [ ] Add relevant permissions to developer-group and ops-group
- [ ] Reassign groups to users. Project leads should be assigned to project-leads group. Developers to developer-group. Ops-group to ops-group.
This issue was brought up in 10/26/2022 ops meeting. We spoke about bringing the discussion here because the scope of this issue is large.
### Resources/Instructions
["Apply least-privilege permissions - When you set permissions with IAM policies, grant only the permissions required to perform a task. You do this by defining the actions that can be taken on specific resources under specific conditions, also known as least-privilege permissions. You might start with broad permissions while you explore the permissions that are required for your workload or use case. As your use case matures, you can work to reduce the permissions that you grant to work toward least privilege"](https://docs.aws.amazon.com/IAM/latest/UserGuide/best-practices.html)
|
True
|
Split the project-leads user group into developer-group, ops-group, and project-leads group - ### Overview
AWS best practice is to give minimum permission to users [only what's needed for the task]. Some users currently have more than that.
### Action Items
- [ ] Create a developer-group
- [ ] Create an ops-group
- [ ] Add relevant permissions to developer-group and ops-group
- [ ] Reassign groups to users. Project leads should be assigned to project-leads group. Developers to developer-group. Ops-group to ops-group.
This issue was brought up in 10/26/2022 ops meeting. We spoke about bringing the discussion here because the scope of this issue is large.
### Resources/Instructions
["Apply least-privilege permissions - When you set permissions with IAM policies, grant only the permissions required to perform a task. You do this by defining the actions that can be taken on specific resources under specific conditions, also known as least-privilege permissions. You might start with broad permissions while you explore the permissions that are required for your workload or use case. As your use case matures, you can work to reduce the permissions that you grant to work toward least privilege"](https://docs.aws.amazon.com/IAM/latest/UserGuide/best-practices.html)
|
non_process
|
split the project leads user group into developer group ops group and project leads group overview aws best practice is to give minimum permission to users some users currently have more than that action items create a developer group create an ops group add relevant permissions to developer group and ops group reassign groups to users project leads should be assigned to project leads group developers to developer group ops group to ops group this issue was brought up in ops meeting we spoke about bringing the discussion here because the scope of this issue is large resources instructions
| 0
|
20,819
| 27,578,844,541
|
IssuesEvent
|
2023-03-08 14:54:02
|
ukri-excalibur/excalibur-tests
|
https://api.github.com/repos/ukri-excalibur/excalibur-tests
|
opened
|
Investigate what other environment variables are needed for postprocessing
|
postprocessing
|
... in addition to the fields in the perflogs - things like compilers and their versions, MPI implementations, the benchmark app version, etc.
- Could any of them be fished out of the perflog fields?
- Should we just ask users to add whatever they need logged as an environment variable?
|
1.0
|
Investigate what other environment variables are needed for postprocessing - ... in addition to the fields in the perflogs - things like compilers and their versions, MPI implementations, the benchmark app version, etc.
- Could any of them be fished out of the perflog fields?
- Should we just ask users to add whatever they need logged as an environment variable?
|
process
|
investigate what other environment variables are needed for postprocessing in addition to the fields in the perflogs things like compilers and their versions mpi implementations the benchmark app version etc could any of them be fished out of the perflog fields should we just ask users to add whatever they need logged as an environment variable
| 1
|
97,344
| 8,653,058,953
|
IssuesEvent
|
2018-11-27 09:51:17
|
humera987/FXLabs-Test-Automation
|
https://api.github.com/repos/humera987/FXLabs-Test-Automation
|
closed
|
new tested 27 : ApiV1IssuesProjectIdIdGetQueryParamPagesizeDdos
|
new tested 27
|
Project : new tested 27
Job : UAT
Env : UAT
Region : US_WEST
Result : fail
Status Code : 404
Headers : {X-Content-Type-Options=[nosniff], X-XSS-Protection=[1; mode=block], Cache-Control=[no-cache, no-store, max-age=0, must-revalidate], Pragma=[no-cache], Expires=[0], X-Frame-Options=[DENY], Set-Cookie=[SESSION=NjEyZGM1N2EtOWY5Yy00NGYzLThkOGEtNTFiN2FmNWYxZjg5; Path=/; HttpOnly], Content-Type=[application/json;charset=UTF-8], Transfer-Encoding=[chunked], Date=[Tue, 27 Nov 2018 09:49:49 GMT]}
Endpoint : http://13.56.210.25/api/v1/api/v1/issues/project-id/GCjWPxxc?pageSize=1001&status=GCjWPxxc
Request :
Response :
{
"timestamp" : "2018-11-27T09:49:49.673+0000",
"status" : 404,
"error" : "Not Found",
"message" : "No message available",
"path" : "/api/v1/api/v1/issues/project-id/GCjWPxxc"
}
Logs :
Assertion [@StatusCode != 401] resolved-to [404 != 401] result [Passed]Assertion [@StatusCode != 404] resolved-to [404 != 404] result [Failed]
--- FX Bot ---
|
1.0
|
new tested 27 : ApiV1IssuesProjectIdIdGetQueryParamPagesizeDdos - Project : new tested 27
Job : UAT
Env : UAT
Region : US_WEST
Result : fail
Status Code : 404
Headers : {X-Content-Type-Options=[nosniff], X-XSS-Protection=[1; mode=block], Cache-Control=[no-cache, no-store, max-age=0, must-revalidate], Pragma=[no-cache], Expires=[0], X-Frame-Options=[DENY], Set-Cookie=[SESSION=NjEyZGM1N2EtOWY5Yy00NGYzLThkOGEtNTFiN2FmNWYxZjg5; Path=/; HttpOnly], Content-Type=[application/json;charset=UTF-8], Transfer-Encoding=[chunked], Date=[Tue, 27 Nov 2018 09:49:49 GMT]}
Endpoint : http://13.56.210.25/api/v1/api/v1/issues/project-id/GCjWPxxc?pageSize=1001&status=GCjWPxxc
Request :
Response :
{
"timestamp" : "2018-11-27T09:49:49.673+0000",
"status" : 404,
"error" : "Not Found",
"message" : "No message available",
"path" : "/api/v1/api/v1/issues/project-id/GCjWPxxc"
}
Logs :
Assertion [@StatusCode != 401] resolved-to [404 != 401] result [Passed]Assertion [@StatusCode != 404] resolved-to [404 != 404] result [Failed]
--- FX Bot ---
|
non_process
|
new tested project new tested job uat env uat region us west result fail status code headers x content type options x xss protection cache control pragma expires x frame options set cookie content type transfer encoding date endpoint request response timestamp status error not found message no message available path api api issues project id gcjwpxxc logs assertion resolved to result assertion resolved to result fx bot
| 0
|
14,996
| 18,676,778,517
|
IssuesEvent
|
2021-10-31 17:43:14
|
slynch8/10x
|
https://api.github.com/repos/slynch8/10x
|
closed
|
Preprocessor doesn't recognise "true"
|
bug Priority 3 trivial preprocessor
|

If I use this instead:
`#define FOO 1`
Then the correct branch is used.
|
1.0
|
Preprocessor doesn't recognise "true" - 
If I use this instead:
`#define FOO 1`
Then the correct branch is used.
|
process
|
preprocessor doesn t recognise true if i use this instead define foo then the correct branch is used
| 1
|
6,662
| 9,782,046,282
|
IssuesEvent
|
2019-06-07 21:46:29
|
googleapis/google-cloud-java
|
https://api.github.com/repos/googleapis/google-cloud-java
|
closed
|
Dependency convergence errors in datastore
|
type: process
|
It's not immediately obvious to me why I see this and the checks in this project's own pom.xml don't; but I definitely see this in a project that does nothing but import the datastore dependency from the google-cloud-java BOM and check dependency convergence:
```
[INFO] Scanning for projects...
[INFO]
[INFO] ------------------------------------------------------------------------
[INFO] Building Dependency Tests for Google Cloud Libraries 0.0.1-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO]
[INFO] --- maven-enforcer-plugin:1.4.1:enforce (enforce) @ dependencytest ---
[WARNING]
Dependency convergence error for com.google.code.findbugs:jsr305:1.3.9 paths to dependency are:
+-com.google.cloud.tools.dependencies:dependencytest:0.0.1-SNAPSHOT
+-com.google.cloud:google-cloud-datastore:1.27.0
+-com.google.cloud:google-cloud-core:1.27.0
+-com.google.guava:guava:24.1-jre
+-com.google.code.findbugs:jsr305:1.3.9
and
+-com.google.cloud.tools.dependencies:dependencytest:0.0.1-SNAPSHOT
+-com.google.cloud:google-cloud-datastore:1.27.0
+-com.google.cloud:google-cloud-core:1.27.0
+-com.google.code.findbugs:jsr305:3.0.1
and
+-com.google.cloud.tools.dependencies:dependencytest:0.0.1-SNAPSHOT
+-com.google.cloud:google-cloud-datastore:1.27.0
+-com.google.cloud:google-cloud-core:1.27.0
+-com.google.api:api-common:1.5.0
+-com.google.code.findbugs:jsr305:3.0.0
and
+-com.google.cloud.tools.dependencies:dependencytest:0.0.1-SNAPSHOT
+-com.google.cloud:google-cloud-datastore:1.27.0
+-com.google.cloud:google-cloud-core:1.27.0
+-com.google.api:gax:1.23.0
+-com.google.code.findbugs:jsr305:3.0.0
and
+-com.google.cloud.tools.dependencies:dependencytest:0.0.1-SNAPSHOT
+-com.google.cloud:google-cloud-datastore:1.27.0
+-com.google.cloud:google-cloud-core-http:1.27.0
+-com.google.oauth-client:google-oauth-client:1.23.0
+-com.google.code.findbugs:jsr305:1.3.9
and
+-com.google.cloud.tools.dependencies:dependencytest:0.0.1-SNAPSHOT
+-com.google.cloud:google-cloud-datastore:1.27.0
+-com.google.cloud:google-cloud-core-http:1.27.0
+-com.google.api:gax-httpjson:0.40.0
+-com.google.code.findbugs:jsr305:3.0.0
and
+-com.google.cloud.tools.dependencies:dependencytest:0.0.1-SNAPSHOT
+-com.google.cloud:google-cloud-datastore:1.27.0
+-com.google.cloud:google-cloud-core-http:1.27.0
+-io.opencensus:opencensus-api:0.11.1
+-com.google.code.findbugs:jsr305:3.0.1
and
+-com.google.cloud.tools.dependencies:dependencytest:0.0.1-SNAPSHOT
+-com.google.cloud:google-cloud-datastore:1.27.0
+-com.google.cloud:google-cloud-core-http:1.27.0
+-io.opencensus:opencensus-contrib-http-util:0.11.1
+-com.google.code.findbugs:jsr305:3.0.1
and
+-com.google.cloud.tools.dependencies:dependencytest:0.0.1-SNAPSHOT
+-com.google.cloud:google-cloud-datastore:1.27.0
+-io.grpc:grpc-core:1.10.1
+-com.google.code.findbugs:jsr305:3.0.0
[WARNING]
Dependency convergence error for com.google.protobuf:protobuf-java:3.5.1 paths to dependency are:
+-com.google.cloud.tools.dependencies:dependencytest:0.0.1-SNAPSHOT
+-com.google.cloud:google-cloud-datastore:1.27.0
+-com.google.cloud:google-cloud-core:1.27.0
+-com.google.protobuf:protobuf-java-util:3.5.1
+-com.google.protobuf:protobuf-java:3.5.1
and
+-com.google.cloud.tools.dependencies:dependencytest:0.0.1-SNAPSHOT
+-com.google.cloud:google-cloud-datastore:1.27.0
+-com.google.cloud:google-cloud-core:1.27.0
+-com.google.api.grpc:proto-google-common-protos:1.9.0
+-com.google.protobuf:protobuf-java:3.5.1
and
+-com.google.cloud.tools.dependencies:dependencytest:0.0.1-SNAPSHOT
+-com.google.cloud:google-cloud-datastore:1.27.0
+-com.google.cloud:google-cloud-core:1.27.0
+-com.google.api.grpc:proto-google-iam-v1:0.10.0
+-com.google.protobuf:protobuf-java:3.5.1
and
+-com.google.cloud.tools.dependencies:dependencytest:0.0.1-SNAPSHOT
+-com.google.cloud:google-cloud-datastore:1.27.0
+-com.google.api.grpc:proto-google-cloud-datastore-v1:0.10.0
+-com.google.protobuf:protobuf-java:3.5.1
and
+-com.google.cloud.tools.dependencies:dependencytest:0.0.1-SNAPSHOT
+-com.google.cloud:google-cloud-datastore:1.27.0
+-com.google.cloud.datastore:datastore-v1-proto-client:1.6.0
+-com.google.http-client:google-http-client-protobuf:1.20.0
+-com.google.protobuf:protobuf-java:2.4.1
[WARNING]
Dependency convergence error for com.google.errorprone:error_prone_annotations:2.1.3 paths to dependency are:
+-com.google.cloud.tools.dependencies:dependencytest:0.0.1-SNAPSHOT
+-com.google.cloud:google-cloud-datastore:1.27.0
+-com.google.cloud:google-cloud-core:1.27.0
+-com.google.guava:guava:24.1-jre
+-com.google.errorprone:error_prone_annotations:2.1.3
and
+-com.google.cloud.tools.dependencies:dependencytest:0.0.1-SNAPSHOT
+-com.google.cloud:google-cloud-datastore:1.27.0
+-com.google.cloud:google-cloud-core-http:1.27.0
+-io.opencensus:opencensus-api:0.11.1
+-com.google.errorprone:error_prone_annotations:2.2.0
and
+-com.google.cloud.tools.dependencies:dependencytest:0.0.1-SNAPSHOT
+-com.google.cloud:google-cloud-datastore:1.27.0
+-com.google.cloud:google-cloud-core-http:1.27.0
+-io.opencensus:opencensus-contrib-http-util:0.11.1
+-com.google.errorprone:error_prone_annotations:2.2.0
and
+-com.google.cloud.tools.dependencies:dependencytest:0.0.1-SNAPSHOT
+-com.google.cloud:google-cloud-datastore:1.27.0
+-io.grpc:grpc-core:1.10.1
+-com.google.errorprone:error_prone_annotations:2.1.2
[WARNING]
Dependency convergence error for com.google.http-client:google-http-client-jackson2:1.19.0 paths to dependency are:
+-com.google.cloud.tools.dependencies:dependencytest:0.0.1-SNAPSHOT
+-com.google.cloud:google-cloud-datastore:1.27.0
+-com.google.cloud:google-cloud-core-http:1.27.0
+-com.google.auth:google-auth-library-oauth2-http:0.9.0
+-com.google.http-client:google-http-client-jackson2:1.19.0
and
+-com.google.cloud.tools.dependencies:dependencytest:0.0.1-SNAPSHOT
+-com.google.cloud:google-cloud-datastore:1.27.0
+-com.google.cloud:google-cloud-core-http:1.27.0
+-com.google.api-client:google-api-client:1.23.0
+-com.google.http-client:google-http-client-jackson2:1.23.0
and
+-com.google.cloud.tools.dependencies:dependencytest:0.0.1-SNAPSHOT
+-com.google.cloud:google-cloud-datastore:1.27.0
+-com.google.cloud:google-cloud-core-http:1.27.0
+-com.google.http-client:google-http-client-jackson2:1.23.0
[WARNING]
Dependency convergence error for com.google.http-client:google-http-client-jackson:1.23.0 paths to dependency are:
+-com.google.cloud.tools.dependencies:dependencytest:0.0.1-SNAPSHOT
+-com.google.cloud:google-cloud-datastore:1.27.0
+-com.google.cloud:google-cloud-core-http:1.27.0
+-com.google.http-client:google-http-client-jackson:1.23.0
and
+-com.google.cloud.tools.dependencies:dependencytest:0.0.1-SNAPSHOT
+-com.google.cloud:google-cloud-datastore:1.27.0
+-com.google.cloud.datastore:datastore-v1-proto-client:1.6.0
+-com.google.http-client:google-http-client-jackson:1.20.0
[WARNING]
Dependency convergence error for com.google.api-client:google-api-client:1.23.0 paths to dependency are:
+-com.google.cloud.tools.dependencies:dependencytest:0.0.1-SNAPSHOT
+-com.google.cloud:google-cloud-datastore:1.27.0
+-com.google.cloud:google-cloud-core-http:1.27.0
+-com.google.api-client:google-api-client:1.23.0
and
+-com.google.cloud.tools.dependencies:dependencytest:0.0.1-SNAPSHOT
+-com.google.cloud:google-cloud-datastore:1.27.0
+-com.google.cloud.datastore:datastore-v1-proto-client:1.6.0
+-com.google.api-client:google-api-client:1.20.0
[WARNING]
Dependency convergence error for io.opencensus:opencensus-api:0.11.1 paths to dependency are:
+-com.google.cloud.tools.dependencies:dependencytest:0.0.1-SNAPSHOT
+-com.google.cloud:google-cloud-datastore:1.27.0
+-com.google.cloud:google-cloud-core-http:1.27.0
+-io.opencensus:opencensus-api:0.11.1
and
+-com.google.cloud.tools.dependencies:dependencytest:0.0.1-SNAPSHOT
+-com.google.cloud:google-cloud-datastore:1.27.0
+-com.google.cloud:google-cloud-core-http:1.27.0
+-io.opencensus:opencensus-contrib-http-util:0.11.1
+-io.opencensus:opencensus-api:0.11.1
and
+-com.google.cloud.tools.dependencies:dependencytest:0.0.1-SNAPSHOT
+-com.google.cloud:google-cloud-datastore:1.27.0
+-io.grpc:grpc-core:1.10.1
+-io.opencensus:opencensus-api:0.11.0
and
+-com.google.cloud.tools.dependencies:dependencytest:0.0.1-SNAPSHOT
+-com.google.cloud:google-cloud-datastore:1.27.0
+-io.grpc:grpc-core:1.10.1
+-io.opencensus:opencensus-contrib-grpc-metrics:0.11.0
+-io.opencensus:opencensus-api:0.11.0
[WARNING]
Dependency convergence error for com.google.oauth-client:google-oauth-client:1.23.0 paths to dependency are:
+-com.google.cloud.tools.dependencies:dependencytest:0.0.1-SNAPSHOT
+-com.google.cloud:google-cloud-datastore:1.27.0
+-com.google.cloud:google-cloud-core-http:1.27.0
+-com.google.oauth-client:google-oauth-client:1.23.0
and
+-com.google.cloud.tools.dependencies:dependencytest:0.0.1-SNAPSHOT
+-com.google.cloud:google-cloud-datastore:1.27.0
+-com.google.cloud:google-cloud-core-http:1.27.0
+-com.google.api-client:google-api-client:1.23.0
+-com.google.oauth-client:google-oauth-client:1.23.0
and
+-com.google.cloud.tools.dependencies:dependencytest:0.0.1-SNAPSHOT
+-com.google.cloud:google-cloud-datastore:1.27.0
+-com.google.cloud.datastore:datastore-v1-proto-client:1.6.0
+-com.google.oauth-client:google-oauth-client:1.20.0
[WARNING]
Dependency convergence error for io.grpc:grpc-context:1.9.0 paths to dependency are:
+-com.google.cloud.tools.dependencies:dependencytest:0.0.1-SNAPSHOT
+-com.google.cloud:google-cloud-datastore:1.27.0
+-com.google.cloud:google-cloud-core-http:1.27.0
+-io.opencensus:opencensus-api:0.11.1
+-io.grpc:grpc-context:1.9.0
and
+-com.google.cloud.tools.dependencies:dependencytest:0.0.1-SNAPSHOT
+-com.google.cloud:google-cloud-datastore:1.27.0
+-io.grpc:grpc-core:1.10.1
+-io.grpc:grpc-context:1.10.1
[WARNING]
Dependency convergence error for com.google.http-client:google-http-client:1.23.0 paths to dependency are:
+-com.google.cloud.tools.dependencies:dependencytest:0.0.1-SNAPSHOT
+-com.google.cloud:google-cloud-datastore:1.27.0
+-com.google.cloud:google-cloud-core:1.27.0
+-com.google.http-client:google-http-client:1.23.0
and
+-com.google.cloud.tools.dependencies:dependencytest:0.0.1-SNAPSHOT
+-com.google.cloud:google-cloud-datastore:1.27.0
+-com.google.cloud:google-cloud-core-http:1.27.0
+-com.google.auth:google-auth-library-oauth2-http:0.9.0
+-com.google.http-client:google-http-client:1.19.0
and
+-com.google.cloud.tools.dependencies:dependencytest:0.0.1-SNAPSHOT
+-com.google.cloud:google-cloud-datastore:1.27.0
+-com.google.cloud:google-cloud-core-http:1.27.0
+-com.google.http-client:google-http-client:1.23.0
and
+-com.google.cloud.tools.dependencies:dependencytest:0.0.1-SNAPSHOT
+-com.google.cloud:google-cloud-datastore:1.27.0
+-com.google.cloud:google-cloud-core-http:1.27.0
+-com.google.oauth-client:google-oauth-client:1.23.0
+-com.google.http-client:google-http-client:1.23.0
and
+-com.google.cloud.tools.dependencies:dependencytest:0.0.1-SNAPSHOT
+-com.google.cloud:google-cloud-datastore:1.27.0
+-com.google.cloud:google-cloud-core-http:1.27.0
+-com.google.http-client:google-http-client-appengine:1.23.0
+-com.google.http-client:google-http-client:1.23.0
and
+-com.google.cloud.tools.dependencies:dependencytest:0.0.1-SNAPSHOT
+-com.google.cloud:google-cloud-datastore:1.27.0
+-com.google.cloud:google-cloud-core-http:1.27.0
+-com.google.http-client:google-http-client-jackson:1.23.0
+-com.google.http-client:google-http-client:1.23.0
and
+-com.google.cloud.tools.dependencies:dependencytest:0.0.1-SNAPSHOT
+-com.google.cloud:google-cloud-datastore:1.27.0
+-com.google.cloud:google-cloud-core-http:1.27.0
+-com.google.api:gax-httpjson:0.40.0
+-com.google.http-client:google-http-client:1.23.0
and
+-com.google.cloud.tools.dependencies:dependencytest:0.0.1-SNAPSHOT
+-com.google.cloud:google-cloud-datastore:1.27.0
+-com.google.cloud.datastore:datastore-v1-proto-client:1.6.0
+-com.google.http-client:google-http-client:1.23.0
and
+-com.google.cloud.tools.dependencies:dependencytest:0.0.1-SNAPSHOT
+-com.google.cloud:google-cloud-datastore:1.27.0
+-com.google.cloud.datastore:datastore-v1-proto-client:1.6.0
+-com.google.http-client:google-http-client-protobuf:1.20.0
+-com.google.http-client:google-http-client:1.20.0
[WARNING] Rule 0: org.apache.maven.plugins.enforcer.DependencyConvergence failed with message:
Failed while enforcing releasability the error(s) are [
Dependency convergence error for com.google.code.findbugs:jsr305:1.3.9 paths to dependency are:
+-com.google.cloud.tools.dependencies:dependencytest:0.0.1-SNAPSHOT
+-com.google.cloud:google-cloud-datastore:1.27.0
+-com.google.cloud:google-cloud-core:1.27.0
+-com.google.guava:guava:24.1-jre
+-com.google.code.findbugs:jsr305:1.3.9
and
+-com.google.cloud.tools.dependencies:dependencytest:0.0.1-SNAPSHOT
+-com.google.cloud:google-cloud-datastore:1.27.0
+-com.google.cloud:google-cloud-core:1.27.0
+-com.google.code.findbugs:jsr305:3.0.1
and
+-com.google.cloud.tools.dependencies:dependencytest:0.0.1-SNAPSHOT
+-com.google.cloud:google-cloud-datastore:1.27.0
+-com.google.cloud:google-cloud-core:1.27.0
+-com.google.api:api-common:1.5.0
+-com.google.code.findbugs:jsr305:3.0.0
and
+-com.google.cloud.tools.dependencies:dependencytest:0.0.1-SNAPSHOT
+-com.google.cloud:google-cloud-datastore:1.27.0
+-com.google.cloud:google-cloud-core:1.27.0
+-com.google.api:gax:1.23.0
+-com.google.code.findbugs:jsr305:3.0.0
and
+-com.google.cloud.tools.dependencies:dependencytest:0.0.1-SNAPSHOT
+-com.google.cloud:google-cloud-datastore:1.27.0
+-com.google.cloud:google-cloud-core-http:1.27.0
+-com.google.oauth-client:google-oauth-client:1.23.0
+-com.google.code.findbugs:jsr305:1.3.9
and
+-com.google.cloud.tools.dependencies:dependencytest:0.0.1-SNAPSHOT
+-com.google.cloud:google-cloud-datastore:1.27.0
+-com.google.cloud:google-cloud-core-http:1.27.0
+-com.google.api:gax-httpjson:0.40.0
+-com.google.code.findbugs:jsr305:3.0.0
and
+-com.google.cloud.tools.dependencies:dependencytest:0.0.1-SNAPSHOT
+-com.google.cloud:google-cloud-datastore:1.27.0
+-com.google.cloud:google-cloud-core-http:1.27.0
+-io.opencensus:opencensus-api:0.11.1
+-com.google.code.findbugs:jsr305:3.0.1
and
+-com.google.cloud.tools.dependencies:dependencytest:0.0.1-SNAPSHOT
+-com.google.cloud:google-cloud-datastore:1.27.0
+-com.google.cloud:google-cloud-core-http:1.27.0
+-io.opencensus:opencensus-contrib-http-util:0.11.1
+-com.google.code.findbugs:jsr305:3.0.1
and
+-com.google.cloud.tools.dependencies:dependencytest:0.0.1-SNAPSHOT
+-com.google.cloud:google-cloud-datastore:1.27.0
+-io.grpc:grpc-core:1.10.1
+-com.google.code.findbugs:jsr305:3.0.0
,
Dependency convergence error for com.google.protobuf:protobuf-java:3.5.1 paths to dependency are:
+-com.google.cloud.tools.dependencies:dependencytest:0.0.1-SNAPSHOT
+-com.google.cloud:google-cloud-datastore:1.27.0
+-com.google.cloud:google-cloud-core:1.27.0
+-com.google.protobuf:protobuf-java-util:3.5.1
+-com.google.protobuf:protobuf-java:3.5.1
and
+-com.google.cloud.tools.dependencies:dependencytest:0.0.1-SNAPSHOT
+-com.google.cloud:google-cloud-datastore:1.27.0
+-com.google.cloud:google-cloud-core:1.27.0
+-com.google.api.grpc:proto-google-common-protos:1.9.0
+-com.google.protobuf:protobuf-java:3.5.1
and
+-com.google.cloud.tools.dependencies:dependencytest:0.0.1-SNAPSHOT
+-com.google.cloud:google-cloud-datastore:1.27.0
+-com.google.cloud:google-cloud-core:1.27.0
+-com.google.api.grpc:proto-google-iam-v1:0.10.0
+-com.google.protobuf:protobuf-java:3.5.1
and
+-com.google.cloud.tools.dependencies:dependencytest:0.0.1-SNAPSHOT
+-com.google.cloud:google-cloud-datastore:1.27.0
+-com.google.api.grpc:proto-google-cloud-datastore-v1:0.10.0
+-com.google.protobuf:protobuf-java:3.5.1
and
+-com.google.cloud.tools.dependencies:dependencytest:0.0.1-SNAPSHOT
+-com.google.cloud:google-cloud-datastore:1.27.0
+-com.google.cloud.datastore:datastore-v1-proto-client:1.6.0
+-com.google.http-client:google-http-client-protobuf:1.20.0
+-com.google.protobuf:protobuf-java:2.4.1
,
Dependency convergence error for com.google.errorprone:error_prone_annotations:2.1.3 paths to dependency are:
+-com.google.cloud.tools.dependencies:dependencytest:0.0.1-SNAPSHOT
+-com.google.cloud:google-cloud-datastore:1.27.0
+-com.google.cloud:google-cloud-core:1.27.0
+-com.google.guava:guava:24.1-jre
+-com.google.errorprone:error_prone_annotations:2.1.3
and
+-com.google.cloud.tools.dependencies:dependencytest:0.0.1-SNAPSHOT
+-com.google.cloud:google-cloud-datastore:1.27.0
+-com.google.cloud:google-cloud-core-http:1.27.0
+-io.opencensus:opencensus-api:0.11.1
+-com.google.errorprone:error_prone_annotations:2.2.0
and
+-com.google.cloud.tools.dependencies:dependencytest:0.0.1-SNAPSHOT
+-com.google.cloud:google-cloud-datastore:1.27.0
+-com.google.cloud:google-cloud-core-http:1.27.0
+-io.opencensus:opencensus-contrib-http-util:0.11.1
+-com.google.errorprone:error_prone_annotations:2.2.0
and
+-com.google.cloud.tools.dependencies:dependencytest:0.0.1-SNAPSHOT
+-com.google.cloud:google-cloud-datastore:1.27.0
+-io.grpc:grpc-core:1.10.1
+-com.google.errorprone:error_prone_annotations:2.1.2
,
Dependency convergence error for com.google.http-client:google-http-client-jackson2:1.19.0 paths to dependency are:
+-com.google.cloud.tools.dependencies:dependencytest:0.0.1-SNAPSHOT
+-com.google.cloud:google-cloud-datastore:1.27.0
+-com.google.cloud:google-cloud-core-http:1.27.0
+-com.google.auth:google-auth-library-oauth2-http:0.9.0
+-com.google.http-client:google-http-client-jackson2:1.19.0
and
+-com.google.cloud.tools.dependencies:dependencytest:0.0.1-SNAPSHOT
+-com.google.cloud:google-cloud-datastore:1.27.0
+-com.google.cloud:google-cloud-core-http:1.27.0
+-com.google.api-client:google-api-client:1.23.0
+-com.google.http-client:google-http-client-jackson2:1.23.0
and
+-com.google.cloud.tools.dependencies:dependencytest:0.0.1-SNAPSHOT
+-com.google.cloud:google-cloud-datastore:1.27.0
+-com.google.cloud:google-cloud-core-http:1.27.0
+-com.google.http-client:google-http-client-jackson2:1.23.0
,
Dependency convergence error for com.google.http-client:google-http-client-jackson:1.23.0 paths to dependency are:
+-com.google.cloud.tools.dependencies:dependencytest:0.0.1-SNAPSHOT
+-com.google.cloud:google-cloud-datastore:1.27.0
+-com.google.cloud:google-cloud-core-http:1.27.0
+-com.google.http-client:google-http-client-jackson:1.23.0
and
+-com.google.cloud.tools.dependencies:dependencytest:0.0.1-SNAPSHOT
+-com.google.cloud:google-cloud-datastore:1.27.0
+-com.google.cloud.datastore:datastore-v1-proto-client:1.6.0
+-com.google.http-client:google-http-client-jackson:1.20.0
,
Dependency convergence error for com.google.api-client:google-api-client:1.23.0 paths to dependency are:
+-com.google.cloud.tools.dependencies:dependencytest:0.0.1-SNAPSHOT
+-com.google.cloud:google-cloud-datastore:1.27.0
+-com.google.cloud:google-cloud-core-http:1.27.0
+-com.google.api-client:google-api-client:1.23.0
and
+-com.google.cloud.tools.dependencies:dependencytest:0.0.1-SNAPSHOT
+-com.google.cloud:google-cloud-datastore:1.27.0
+-com.google.cloud.datastore:datastore-v1-proto-client:1.6.0
+-com.google.api-client:google-api-client:1.20.0
,
Dependency convergence error for io.opencensus:opencensus-api:0.11.1 paths to dependency are:
+-com.google.cloud.tools.dependencies:dependencytest:0.0.1-SNAPSHOT
+-com.google.cloud:google-cloud-datastore:1.27.0
+-com.google.cloud:google-cloud-core-http:1.27.0
+-io.opencensus:opencensus-api:0.11.1
and
+-com.google.cloud.tools.dependencies:dependencytest:0.0.1-SNAPSHOT
+-com.google.cloud:google-cloud-datastore:1.27.0
+-com.google.cloud:google-cloud-core-http:1.27.0
+-io.opencensus:opencensus-contrib-http-util:0.11.1
+-io.opencensus:opencensus-api:0.11.1
and
+-com.google.cloud.tools.dependencies:dependencytest:0.0.1-SNAPSHOT
+-com.google.cloud:google-cloud-datastore:1.27.0
+-io.grpc:grpc-core:1.10.1
+-io.opencensus:opencensus-api:0.11.0
and
+-com.google.cloud.tools.dependencies:dependencytest:0.0.1-SNAPSHOT
+-com.google.cloud:google-cloud-datastore:1.27.0
+-io.grpc:grpc-core:1.10.1
+-io.opencensus:opencensus-contrib-grpc-metrics:0.11.0
+-io.opencensus:opencensus-api:0.11.0
,
Dependency convergence error for com.google.oauth-client:google-oauth-client:1.23.0 paths to dependency are:
+-com.google.cloud.tools.dependencies:dependencytest:0.0.1-SNAPSHOT
+-com.google.cloud:google-cloud-datastore:1.27.0
+-com.google.cloud:google-cloud-core-http:1.27.0
+-com.google.oauth-client:google-oauth-client:1.23.0
and
+-com.google.cloud.tools.dependencies:dependencytest:0.0.1-SNAPSHOT
+-com.google.cloud:google-cloud-datastore:1.27.0
+-com.google.cloud:google-cloud-core-http:1.27.0
+-com.google.api-client:google-api-client:1.23.0
+-com.google.oauth-client:google-oauth-client:1.23.0
and
+-com.google.cloud.tools.dependencies:dependencytest:0.0.1-SNAPSHOT
+-com.google.cloud:google-cloud-datastore:1.27.0
+-com.google.cloud.datastore:datastore-v1-proto-client:1.6.0
+-com.google.oauth-client:google-oauth-client:1.20.0
,
Dependency convergence error for io.grpc:grpc-context:1.9.0 paths to dependency are:
+-com.google.cloud.tools.dependencies:dependencytest:0.0.1-SNAPSHOT
+-com.google.cloud:google-cloud-datastore:1.27.0
+-com.google.cloud:google-cloud-core-http:1.27.0
+-io.opencensus:opencensus-api:0.11.1
+-io.grpc:grpc-context:1.9.0
and
+-com.google.cloud.tools.dependencies:dependencytest:0.0.1-SNAPSHOT
+-com.google.cloud:google-cloud-datastore:1.27.0
+-io.grpc:grpc-core:1.10.1
+-io.grpc:grpc-context:1.10.1
,
Dependency convergence error for com.google.http-client:google-http-client:1.23.0 paths to dependency are:
+-com.google.cloud.tools.dependencies:dependencytest:0.0.1-SNAPSHOT
+-com.google.cloud:google-cloud-datastore:1.27.0
+-com.google.cloud:google-cloud-core:1.27.0
+-com.google.http-client:google-http-client:1.23.0
and
+-com.google.cloud.tools.dependencies:dependencytest:0.0.1-SNAPSHOT
+-com.google.cloud:google-cloud-datastore:1.27.0
+-com.google.cloud:google-cloud-core-http:1.27.0
+-com.google.auth:google-auth-library-oauth2-http:0.9.0
+-com.google.http-client:google-http-client:1.19.0
and
+-com.google.cloud.tools.dependencies:dependencytest:0.0.1-SNAPSHOT
+-com.google.cloud:google-cloud-datastore:1.27.0
+-com.google.cloud:google-cloud-core-http:1.27.0
+-com.google.http-client:google-http-client:1.23.0
and
+-com.google.cloud.tools.dependencies:dependencytest:0.0.1-SNAPSHOT
+-com.google.cloud:google-cloud-datastore:1.27.0
+-com.google.cloud:google-cloud-core-http:1.27.0
+-com.google.oauth-client:google-oauth-client:1.23.0
+-com.google.http-client:google-http-client:1.23.0
and
+-com.google.cloud.tools.dependencies:dependencytest:0.0.1-SNAPSHOT
+-com.google.cloud:google-cloud-datastore:1.27.0
+-com.google.cloud:google-cloud-core-http:1.27.0
+-com.google.http-client:google-http-client-appengine:1.23.0
+-com.google.http-client:google-http-client:1.23.0
and
+-com.google.cloud.tools.dependencies:dependencytest:0.0.1-SNAPSHOT
+-com.google.cloud:google-cloud-datastore:1.27.0
+-com.google.cloud:google-cloud-core-http:1.27.0
+-com.google.http-client:google-http-client-jackson:1.23.0
+-com.google.http-client:google-http-client:1.23.0
and
+-com.google.cloud.tools.dependencies:dependencytest:0.0.1-SNAPSHOT
+-com.google.cloud:google-cloud-datastore:1.27.0
+-com.google.cloud:google-cloud-core-http:1.27.0
+-com.google.api:gax-httpjson:0.40.0
+-com.google.http-client:google-http-client:1.23.0
and
+-com.google.cloud.tools.dependencies:dependencytest:0.0.1-SNAPSHOT
+-com.google.cloud:google-cloud-datastore:1.27.0
+-com.google.cloud.datastore:datastore-v1-proto-client:1.6.0
+-com.google.http-client:google-http-client:1.23.0
and
+-com.google.cloud.tools.dependencies:dependencytest:0.0.1-SNAPSHOT
+-com.google.cloud:google-cloud-datastore:1.27.0
+-com.google.cloud.datastore:datastore-v1-proto-client:1.6.0
+-com.google.http-client:google-http-client-protobuf:1.20.0
+-com.google.http-client:google-http-client:1.20.0
]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 3.710 s
[INFO] Finished at: 2018-04-25T08:04:18-04:00
[INFO] Final Memory: 10M/159M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-enforcer-plugin:1.4.1:enforce (enforce) on project dependencytest: Some Enforcer rules have failed. Look above for specific messages explaining why the rule failed. -> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
```
|
1.0
|
Dependency convergence errors in datastore - It's not immediately obvious to me why I see this and the checks in this project's own pom.xml don't; but I definitely see this in a project that does nothing but import the datastore dependency from the google-cloud-java BOM and check dependency convergence:
```
[INFO] Scanning for projects...
[INFO]
[INFO] ------------------------------------------------------------------------
[INFO] Building Dependency Tests for Google Cloud Libraries 0.0.1-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO]
[INFO] --- maven-enforcer-plugin:1.4.1:enforce (enforce) @ dependencytest ---
[WARNING]
Dependency convergence error for com.google.code.findbugs:jsr305:1.3.9 paths to dependency are:
+-com.google.cloud.tools.dependencies:dependencytest:0.0.1-SNAPSHOT
+-com.google.cloud:google-cloud-datastore:1.27.0
+-com.google.cloud:google-cloud-core:1.27.0
+-com.google.guava:guava:24.1-jre
+-com.google.code.findbugs:jsr305:1.3.9
and
+-com.google.cloud.tools.dependencies:dependencytest:0.0.1-SNAPSHOT
+-com.google.cloud:google-cloud-datastore:1.27.0
+-com.google.cloud:google-cloud-core:1.27.0
+-com.google.code.findbugs:jsr305:3.0.1
and
+-com.google.cloud.tools.dependencies:dependencytest:0.0.1-SNAPSHOT
+-com.google.cloud:google-cloud-datastore:1.27.0
+-com.google.cloud:google-cloud-core:1.27.0
+-com.google.api:api-common:1.5.0
+-com.google.code.findbugs:jsr305:3.0.0
and
+-com.google.cloud.tools.dependencies:dependencytest:0.0.1-SNAPSHOT
+-com.google.cloud:google-cloud-datastore:1.27.0
+-com.google.cloud:google-cloud-core:1.27.0
+-com.google.api:gax:1.23.0
+-com.google.code.findbugs:jsr305:3.0.0
and
+-com.google.cloud.tools.dependencies:dependencytest:0.0.1-SNAPSHOT
+-com.google.cloud:google-cloud-datastore:1.27.0
+-com.google.cloud:google-cloud-core-http:1.27.0
+-com.google.oauth-client:google-oauth-client:1.23.0
+-com.google.code.findbugs:jsr305:1.3.9
and
+-com.google.cloud.tools.dependencies:dependencytest:0.0.1-SNAPSHOT
+-com.google.cloud:google-cloud-datastore:1.27.0
+-com.google.cloud:google-cloud-core-http:1.27.0
+-com.google.api:gax-httpjson:0.40.0
+-com.google.code.findbugs:jsr305:3.0.0
and
+-com.google.cloud.tools.dependencies:dependencytest:0.0.1-SNAPSHOT
+-com.google.cloud:google-cloud-datastore:1.27.0
+-com.google.cloud:google-cloud-core-http:1.27.0
+-io.opencensus:opencensus-api:0.11.1
+-com.google.code.findbugs:jsr305:3.0.1
and
+-com.google.cloud.tools.dependencies:dependencytest:0.0.1-SNAPSHOT
+-com.google.cloud:google-cloud-datastore:1.27.0
+-com.google.cloud:google-cloud-core-http:1.27.0
+-io.opencensus:opencensus-contrib-http-util:0.11.1
+-com.google.code.findbugs:jsr305:3.0.1
and
+-com.google.cloud.tools.dependencies:dependencytest:0.0.1-SNAPSHOT
+-com.google.cloud:google-cloud-datastore:1.27.0
+-io.grpc:grpc-core:1.10.1
+-com.google.code.findbugs:jsr305:3.0.0
[WARNING]
Dependency convergence error for com.google.protobuf:protobuf-java:3.5.1 paths to dependency are:
+-com.google.cloud.tools.dependencies:dependencytest:0.0.1-SNAPSHOT
+-com.google.cloud:google-cloud-datastore:1.27.0
+-com.google.cloud:google-cloud-core:1.27.0
+-com.google.protobuf:protobuf-java-util:3.5.1
+-com.google.protobuf:protobuf-java:3.5.1
and
+-com.google.cloud.tools.dependencies:dependencytest:0.0.1-SNAPSHOT
+-com.google.cloud:google-cloud-datastore:1.27.0
+-com.google.cloud:google-cloud-core:1.27.0
+-com.google.api.grpc:proto-google-common-protos:1.9.0
+-com.google.protobuf:protobuf-java:3.5.1
and
+-com.google.cloud.tools.dependencies:dependencytest:0.0.1-SNAPSHOT
+-com.google.cloud:google-cloud-datastore:1.27.0
+-com.google.cloud:google-cloud-core:1.27.0
+-com.google.api.grpc:proto-google-iam-v1:0.10.0
+-com.google.protobuf:protobuf-java:3.5.1
and
+-com.google.cloud.tools.dependencies:dependencytest:0.0.1-SNAPSHOT
+-com.google.cloud:google-cloud-datastore:1.27.0
+-com.google.api.grpc:proto-google-cloud-datastore-v1:0.10.0
+-com.google.protobuf:protobuf-java:3.5.1
and
+-com.google.cloud.tools.dependencies:dependencytest:0.0.1-SNAPSHOT
+-com.google.cloud:google-cloud-datastore:1.27.0
+-com.google.cloud.datastore:datastore-v1-proto-client:1.6.0
+-com.google.http-client:google-http-client-protobuf:1.20.0
+-com.google.protobuf:protobuf-java:2.4.1
[WARNING]
Dependency convergence error for com.google.errorprone:error_prone_annotations:2.1.3 paths to dependency are:
+-com.google.cloud.tools.dependencies:dependencytest:0.0.1-SNAPSHOT
+-com.google.cloud:google-cloud-datastore:1.27.0
+-com.google.cloud:google-cloud-core:1.27.0
+-com.google.guava:guava:24.1-jre
+-com.google.errorprone:error_prone_annotations:2.1.3
and
+-com.google.cloud.tools.dependencies:dependencytest:0.0.1-SNAPSHOT
+-com.google.cloud:google-cloud-datastore:1.27.0
+-com.google.cloud:google-cloud-core-http:1.27.0
+-io.opencensus:opencensus-api:0.11.1
+-com.google.errorprone:error_prone_annotations:2.2.0
and
+-com.google.cloud.tools.dependencies:dependencytest:0.0.1-SNAPSHOT
+-com.google.cloud:google-cloud-datastore:1.27.0
+-com.google.cloud:google-cloud-core-http:1.27.0
+-io.opencensus:opencensus-contrib-http-util:0.11.1
+-com.google.errorprone:error_prone_annotations:2.2.0
and
+-com.google.cloud.tools.dependencies:dependencytest:0.0.1-SNAPSHOT
+-com.google.cloud:google-cloud-datastore:1.27.0
+-io.grpc:grpc-core:1.10.1
+-com.google.errorprone:error_prone_annotations:2.1.2
[WARNING]
Dependency convergence error for com.google.http-client:google-http-client-jackson2:1.19.0 paths to dependency are:
+-com.google.cloud.tools.dependencies:dependencytest:0.0.1-SNAPSHOT
+-com.google.cloud:google-cloud-datastore:1.27.0
+-com.google.cloud:google-cloud-core-http:1.27.0
+-com.google.auth:google-auth-library-oauth2-http:0.9.0
+-com.google.http-client:google-http-client-jackson2:1.19.0
and
+-com.google.cloud.tools.dependencies:dependencytest:0.0.1-SNAPSHOT
+-com.google.cloud:google-cloud-datastore:1.27.0
+-com.google.cloud:google-cloud-core-http:1.27.0
+-com.google.api-client:google-api-client:1.23.0
+-com.google.http-client:google-http-client-jackson2:1.23.0
and
+-com.google.cloud.tools.dependencies:dependencytest:0.0.1-SNAPSHOT
+-com.google.cloud:google-cloud-datastore:1.27.0
+-com.google.cloud:google-cloud-core-http:1.27.0
+-com.google.http-client:google-http-client-jackson2:1.23.0
[WARNING]
Dependency convergence error for com.google.http-client:google-http-client-jackson:1.23.0 paths to dependency are:
+-com.google.cloud.tools.dependencies:dependencytest:0.0.1-SNAPSHOT
+-com.google.cloud:google-cloud-datastore:1.27.0
+-com.google.cloud:google-cloud-core-http:1.27.0
+-com.google.http-client:google-http-client-jackson:1.23.0
and
+-com.google.cloud.tools.dependencies:dependencytest:0.0.1-SNAPSHOT
+-com.google.cloud:google-cloud-datastore:1.27.0
+-com.google.cloud.datastore:datastore-v1-proto-client:1.6.0
+-com.google.http-client:google-http-client-jackson:1.20.0
[WARNING]
Dependency convergence error for com.google.api-client:google-api-client:1.23.0 paths to dependency are:
+-com.google.cloud.tools.dependencies:dependencytest:0.0.1-SNAPSHOT
+-com.google.cloud:google-cloud-datastore:1.27.0
+-com.google.cloud:google-cloud-core-http:1.27.0
+-com.google.api-client:google-api-client:1.23.0
and
+-com.google.cloud.tools.dependencies:dependencytest:0.0.1-SNAPSHOT
+-com.google.cloud:google-cloud-datastore:1.27.0
+-com.google.cloud.datastore:datastore-v1-proto-client:1.6.0
+-com.google.api-client:google-api-client:1.20.0
[WARNING]
Dependency convergence error for io.opencensus:opencensus-api:0.11.1 paths to dependency are:
+-com.google.cloud.tools.dependencies:dependencytest:0.0.1-SNAPSHOT
+-com.google.cloud:google-cloud-datastore:1.27.0
+-com.google.cloud:google-cloud-core-http:1.27.0
+-io.opencensus:opencensus-api:0.11.1
and
+-com.google.cloud.tools.dependencies:dependencytest:0.0.1-SNAPSHOT
+-com.google.cloud:google-cloud-datastore:1.27.0
+-com.google.cloud:google-cloud-core-http:1.27.0
+-io.opencensus:opencensus-contrib-http-util:0.11.1
+-io.opencensus:opencensus-api:0.11.1
and
+-com.google.cloud.tools.dependencies:dependencytest:0.0.1-SNAPSHOT
+-com.google.cloud:google-cloud-datastore:1.27.0
+-io.grpc:grpc-core:1.10.1
+-io.opencensus:opencensus-api:0.11.0
and
+-com.google.cloud.tools.dependencies:dependencytest:0.0.1-SNAPSHOT
+-com.google.cloud:google-cloud-datastore:1.27.0
+-io.grpc:grpc-core:1.10.1
+-io.opencensus:opencensus-contrib-grpc-metrics:0.11.0
+-io.opencensus:opencensus-api:0.11.0
[WARNING]
Dependency convergence error for com.google.oauth-client:google-oauth-client:1.23.0 paths to dependency are:
+-com.google.cloud.tools.dependencies:dependencytest:0.0.1-SNAPSHOT
+-com.google.cloud:google-cloud-datastore:1.27.0
+-com.google.cloud:google-cloud-core-http:1.27.0
+-com.google.oauth-client:google-oauth-client:1.23.0
and
+-com.google.cloud.tools.dependencies:dependencytest:0.0.1-SNAPSHOT
+-com.google.cloud:google-cloud-datastore:1.27.0
+-com.google.cloud:google-cloud-core-http:1.27.0
+-com.google.api-client:google-api-client:1.23.0
+-com.google.oauth-client:google-oauth-client:1.23.0
and
+-com.google.cloud.tools.dependencies:dependencytest:0.0.1-SNAPSHOT
+-com.google.cloud:google-cloud-datastore:1.27.0
+-com.google.cloud.datastore:datastore-v1-proto-client:1.6.0
+-com.google.oauth-client:google-oauth-client:1.20.0
[WARNING]
Dependency convergence error for io.grpc:grpc-context:1.9.0 paths to dependency are:
+-com.google.cloud.tools.dependencies:dependencytest:0.0.1-SNAPSHOT
+-com.google.cloud:google-cloud-datastore:1.27.0
+-com.google.cloud:google-cloud-core-http:1.27.0
+-io.opencensus:opencensus-api:0.11.1
+-io.grpc:grpc-context:1.9.0
and
+-com.google.cloud.tools.dependencies:dependencytest:0.0.1-SNAPSHOT
+-com.google.cloud:google-cloud-datastore:1.27.0
+-io.grpc:grpc-core:1.10.1
+-io.grpc:grpc-context:1.10.1
[WARNING]
Dependency convergence error for com.google.http-client:google-http-client:1.23.0 paths to dependency are:
+-com.google.cloud.tools.dependencies:dependencytest:0.0.1-SNAPSHOT
+-com.google.cloud:google-cloud-datastore:1.27.0
+-com.google.cloud:google-cloud-core:1.27.0
+-com.google.http-client:google-http-client:1.23.0
and
+-com.google.cloud.tools.dependencies:dependencytest:0.0.1-SNAPSHOT
+-com.google.cloud:google-cloud-datastore:1.27.0
+-com.google.cloud:google-cloud-core-http:1.27.0
+-com.google.auth:google-auth-library-oauth2-http:0.9.0
+-com.google.http-client:google-http-client:1.19.0
and
+-com.google.cloud.tools.dependencies:dependencytest:0.0.1-SNAPSHOT
+-com.google.cloud:google-cloud-datastore:1.27.0
+-com.google.cloud:google-cloud-core-http:1.27.0
+-com.google.http-client:google-http-client:1.23.0
and
+-com.google.cloud.tools.dependencies:dependencytest:0.0.1-SNAPSHOT
+-com.google.cloud:google-cloud-datastore:1.27.0
+-com.google.cloud:google-cloud-core-http:1.27.0
+-com.google.oauth-client:google-oauth-client:1.23.0
+-com.google.http-client:google-http-client:1.23.0
and
+-com.google.cloud.tools.dependencies:dependencytest:0.0.1-SNAPSHOT
+-com.google.cloud:google-cloud-datastore:1.27.0
+-com.google.cloud:google-cloud-core-http:1.27.0
+-com.google.http-client:google-http-client-appengine:1.23.0
+-com.google.http-client:google-http-client:1.23.0
and
+-com.google.cloud.tools.dependencies:dependencytest:0.0.1-SNAPSHOT
+-com.google.cloud:google-cloud-datastore:1.27.0
+-com.google.cloud:google-cloud-core-http:1.27.0
+-com.google.http-client:google-http-client-jackson:1.23.0
+-com.google.http-client:google-http-client:1.23.0
and
+-com.google.cloud.tools.dependencies:dependencytest:0.0.1-SNAPSHOT
+-com.google.cloud:google-cloud-datastore:1.27.0
+-com.google.cloud:google-cloud-core-http:1.27.0
+-com.google.api:gax-httpjson:0.40.0
+-com.google.http-client:google-http-client:1.23.0
and
+-com.google.cloud.tools.dependencies:dependencytest:0.0.1-SNAPSHOT
+-com.google.cloud:google-cloud-datastore:1.27.0
+-com.google.cloud.datastore:datastore-v1-proto-client:1.6.0
+-com.google.http-client:google-http-client:1.23.0
and
+-com.google.cloud.tools.dependencies:dependencytest:0.0.1-SNAPSHOT
+-com.google.cloud:google-cloud-datastore:1.27.0
+-com.google.cloud.datastore:datastore-v1-proto-client:1.6.0
+-com.google.http-client:google-http-client-protobuf:1.20.0
+-com.google.http-client:google-http-client:1.20.0
[WARNING] Rule 0: org.apache.maven.plugins.enforcer.DependencyConvergence failed with message:
Failed while enforcing releasability the error(s) are [
Dependency convergence error for com.google.code.findbugs:jsr305:1.3.9 paths to dependency are:
+-com.google.cloud.tools.dependencies:dependencytest:0.0.1-SNAPSHOT
+-com.google.cloud:google-cloud-datastore:1.27.0
+-com.google.cloud:google-cloud-core:1.27.0
+-com.google.guava:guava:24.1-jre
+-com.google.code.findbugs:jsr305:1.3.9
and
+-com.google.cloud.tools.dependencies:dependencytest:0.0.1-SNAPSHOT
+-com.google.cloud:google-cloud-datastore:1.27.0
+-com.google.cloud:google-cloud-core:1.27.0
+-com.google.code.findbugs:jsr305:3.0.1
and
+-com.google.cloud.tools.dependencies:dependencytest:0.0.1-SNAPSHOT
+-com.google.cloud:google-cloud-datastore:1.27.0
+-com.google.cloud:google-cloud-core:1.27.0
+-com.google.api:api-common:1.5.0
+-com.google.code.findbugs:jsr305:3.0.0
and
+-com.google.cloud.tools.dependencies:dependencytest:0.0.1-SNAPSHOT
+-com.google.cloud:google-cloud-datastore:1.27.0
+-com.google.cloud:google-cloud-core:1.27.0
+-com.google.api:gax:1.23.0
+-com.google.code.findbugs:jsr305:3.0.0
and
+-com.google.cloud.tools.dependencies:dependencytest:0.0.1-SNAPSHOT
+-com.google.cloud:google-cloud-datastore:1.27.0
+-com.google.cloud:google-cloud-core-http:1.27.0
+-com.google.oauth-client:google-oauth-client:1.23.0
+-com.google.code.findbugs:jsr305:1.3.9
and
+-com.google.cloud.tools.dependencies:dependencytest:0.0.1-SNAPSHOT
+-com.google.cloud:google-cloud-datastore:1.27.0
+-com.google.cloud:google-cloud-core-http:1.27.0
+-com.google.api:gax-httpjson:0.40.0
+-com.google.code.findbugs:jsr305:3.0.0
and
+-com.google.cloud.tools.dependencies:dependencytest:0.0.1-SNAPSHOT
+-com.google.cloud:google-cloud-datastore:1.27.0
+-com.google.cloud:google-cloud-core-http:1.27.0
+-io.opencensus:opencensus-api:0.11.1
+-com.google.code.findbugs:jsr305:3.0.1
and
+-com.google.cloud.tools.dependencies:dependencytest:0.0.1-SNAPSHOT
+-com.google.cloud:google-cloud-datastore:1.27.0
+-com.google.cloud:google-cloud-core-http:1.27.0
+-io.opencensus:opencensus-contrib-http-util:0.11.1
+-com.google.code.findbugs:jsr305:3.0.1
and
+-com.google.cloud.tools.dependencies:dependencytest:0.0.1-SNAPSHOT
+-com.google.cloud:google-cloud-datastore:1.27.0
+-io.grpc:grpc-core:1.10.1
+-com.google.code.findbugs:jsr305:3.0.0
,
Dependency convergence error for com.google.protobuf:protobuf-java:3.5.1 paths to dependency are:
+-com.google.cloud.tools.dependencies:dependencytest:0.0.1-SNAPSHOT
+-com.google.cloud:google-cloud-datastore:1.27.0
+-com.google.cloud:google-cloud-core:1.27.0
+-com.google.protobuf:protobuf-java-util:3.5.1
+-com.google.protobuf:protobuf-java:3.5.1
and
+-com.google.cloud.tools.dependencies:dependencytest:0.0.1-SNAPSHOT
+-com.google.cloud:google-cloud-datastore:1.27.0
+-com.google.cloud:google-cloud-core:1.27.0
+-com.google.api.grpc:proto-google-common-protos:1.9.0
+-com.google.protobuf:protobuf-java:3.5.1
and
+-com.google.cloud.tools.dependencies:dependencytest:0.0.1-SNAPSHOT
+-com.google.cloud:google-cloud-datastore:1.27.0
+-com.google.cloud:google-cloud-core:1.27.0
+-com.google.api.grpc:proto-google-iam-v1:0.10.0
+-com.google.protobuf:protobuf-java:3.5.1
and
+-com.google.cloud.tools.dependencies:dependencytest:0.0.1-SNAPSHOT
+-com.google.cloud:google-cloud-datastore:1.27.0
+-com.google.api.grpc:proto-google-cloud-datastore-v1:0.10.0
+-com.google.protobuf:protobuf-java:3.5.1
and
+-com.google.cloud.tools.dependencies:dependencytest:0.0.1-SNAPSHOT
+-com.google.cloud:google-cloud-datastore:1.27.0
+-com.google.cloud.datastore:datastore-v1-proto-client:1.6.0
+-com.google.http-client:google-http-client-protobuf:1.20.0
+-com.google.protobuf:protobuf-java:2.4.1
,
Dependency convergence error for com.google.errorprone:error_prone_annotations:2.1.3 paths to dependency are:
+-com.google.cloud.tools.dependencies:dependencytest:0.0.1-SNAPSHOT
+-com.google.cloud:google-cloud-datastore:1.27.0
+-com.google.cloud:google-cloud-core:1.27.0
+-com.google.guava:guava:24.1-jre
+-com.google.errorprone:error_prone_annotations:2.1.3
and
+-com.google.cloud.tools.dependencies:dependencytest:0.0.1-SNAPSHOT
+-com.google.cloud:google-cloud-datastore:1.27.0
+-com.google.cloud:google-cloud-core-http:1.27.0
+-io.opencensus:opencensus-api:0.11.1
+-com.google.errorprone:error_prone_annotations:2.2.0
and
+-com.google.cloud.tools.dependencies:dependencytest:0.0.1-SNAPSHOT
+-com.google.cloud:google-cloud-datastore:1.27.0
+-com.google.cloud:google-cloud-core-http:1.27.0
+-io.opencensus:opencensus-contrib-http-util:0.11.1
+-com.google.errorprone:error_prone_annotations:2.2.0
and
+-com.google.cloud.tools.dependencies:dependencytest:0.0.1-SNAPSHOT
+-com.google.cloud:google-cloud-datastore:1.27.0
+-io.grpc:grpc-core:1.10.1
+-com.google.errorprone:error_prone_annotations:2.1.2
,
Dependency convergence error for com.google.http-client:google-http-client-jackson2:1.19.0 paths to dependency are:
+-com.google.cloud.tools.dependencies:dependencytest:0.0.1-SNAPSHOT
+-com.google.cloud:google-cloud-datastore:1.27.0
+-com.google.cloud:google-cloud-core-http:1.27.0
+-com.google.auth:google-auth-library-oauth2-http:0.9.0
+-com.google.http-client:google-http-client-jackson2:1.19.0
and
+-com.google.cloud.tools.dependencies:dependencytest:0.0.1-SNAPSHOT
+-com.google.cloud:google-cloud-datastore:1.27.0
+-com.google.cloud:google-cloud-core-http:1.27.0
+-com.google.api-client:google-api-client:1.23.0
+-com.google.http-client:google-http-client-jackson2:1.23.0
and
+-com.google.cloud.tools.dependencies:dependencytest:0.0.1-SNAPSHOT
+-com.google.cloud:google-cloud-datastore:1.27.0
+-com.google.cloud:google-cloud-core-http:1.27.0
+-com.google.http-client:google-http-client-jackson2:1.23.0
,
Dependency convergence error for com.google.http-client:google-http-client-jackson:1.23.0 paths to dependency are:
+-com.google.cloud.tools.dependencies:dependencytest:0.0.1-SNAPSHOT
+-com.google.cloud:google-cloud-datastore:1.27.0
+-com.google.cloud:google-cloud-core-http:1.27.0
+-com.google.http-client:google-http-client-jackson:1.23.0
and
+-com.google.cloud.tools.dependencies:dependencytest:0.0.1-SNAPSHOT
+-com.google.cloud:google-cloud-datastore:1.27.0
+-com.google.cloud.datastore:datastore-v1-proto-client:1.6.0
+-com.google.http-client:google-http-client-jackson:1.20.0
,
Dependency convergence error for com.google.api-client:google-api-client:1.23.0 paths to dependency are:
+-com.google.cloud.tools.dependencies:dependencytest:0.0.1-SNAPSHOT
+-com.google.cloud:google-cloud-datastore:1.27.0
+-com.google.cloud:google-cloud-core-http:1.27.0
+-com.google.api-client:google-api-client:1.23.0
and
+-com.google.cloud.tools.dependencies:dependencytest:0.0.1-SNAPSHOT
+-com.google.cloud:google-cloud-datastore:1.27.0
+-com.google.cloud.datastore:datastore-v1-proto-client:1.6.0
+-com.google.api-client:google-api-client:1.20.0
,
Dependency convergence error for io.opencensus:opencensus-api:0.11.1 paths to dependency are:
+-com.google.cloud.tools.dependencies:dependencytest:0.0.1-SNAPSHOT
+-com.google.cloud:google-cloud-datastore:1.27.0
+-com.google.cloud:google-cloud-core-http:1.27.0
+-io.opencensus:opencensus-api:0.11.1
and
+-com.google.cloud.tools.dependencies:dependencytest:0.0.1-SNAPSHOT
+-com.google.cloud:google-cloud-datastore:1.27.0
+-com.google.cloud:google-cloud-core-http:1.27.0
+-io.opencensus:opencensus-contrib-http-util:0.11.1
+-io.opencensus:opencensus-api:0.11.1
and
+-com.google.cloud.tools.dependencies:dependencytest:0.0.1-SNAPSHOT
+-com.google.cloud:google-cloud-datastore:1.27.0
+-io.grpc:grpc-core:1.10.1
+-io.opencensus:opencensus-api:0.11.0
and
+-com.google.cloud.tools.dependencies:dependencytest:0.0.1-SNAPSHOT
+-com.google.cloud:google-cloud-datastore:1.27.0
+-io.grpc:grpc-core:1.10.1
+-io.opencensus:opencensus-contrib-grpc-metrics:0.11.0
+-io.opencensus:opencensus-api:0.11.0
,
Dependency convergence error for com.google.oauth-client:google-oauth-client:1.23.0 paths to dependency are:
+-com.google.cloud.tools.dependencies:dependencytest:0.0.1-SNAPSHOT
+-com.google.cloud:google-cloud-datastore:1.27.0
+-com.google.cloud:google-cloud-core-http:1.27.0
+-com.google.oauth-client:google-oauth-client:1.23.0
and
+-com.google.cloud.tools.dependencies:dependencytest:0.0.1-SNAPSHOT
+-com.google.cloud:google-cloud-datastore:1.27.0
+-com.google.cloud:google-cloud-core-http:1.27.0
+-com.google.api-client:google-api-client:1.23.0
+-com.google.oauth-client:google-oauth-client:1.23.0
and
+-com.google.cloud.tools.dependencies:dependencytest:0.0.1-SNAPSHOT
+-com.google.cloud:google-cloud-datastore:1.27.0
+-com.google.cloud.datastore:datastore-v1-proto-client:1.6.0
+-com.google.oauth-client:google-oauth-client:1.20.0
,
Dependency convergence error for io.grpc:grpc-context:1.9.0 paths to dependency are:
+-com.google.cloud.tools.dependencies:dependencytest:0.0.1-SNAPSHOT
+-com.google.cloud:google-cloud-datastore:1.27.0
+-com.google.cloud:google-cloud-core-http:1.27.0
+-io.opencensus:opencensus-api:0.11.1
+-io.grpc:grpc-context:1.9.0
and
+-com.google.cloud.tools.dependencies:dependencytest:0.0.1-SNAPSHOT
+-com.google.cloud:google-cloud-datastore:1.27.0
+-io.grpc:grpc-core:1.10.1
+-io.grpc:grpc-context:1.10.1
,
Dependency convergence error for com.google.http-client:google-http-client:1.23.0 paths to dependency are:
+-com.google.cloud.tools.dependencies:dependencytest:0.0.1-SNAPSHOT
+-com.google.cloud:google-cloud-datastore:1.27.0
+-com.google.cloud:google-cloud-core:1.27.0
+-com.google.http-client:google-http-client:1.23.0
and
+-com.google.cloud.tools.dependencies:dependencytest:0.0.1-SNAPSHOT
+-com.google.cloud:google-cloud-datastore:1.27.0
+-com.google.cloud:google-cloud-core-http:1.27.0
+-com.google.auth:google-auth-library-oauth2-http:0.9.0
+-com.google.http-client:google-http-client:1.19.0
and
+-com.google.cloud.tools.dependencies:dependencytest:0.0.1-SNAPSHOT
+-com.google.cloud:google-cloud-datastore:1.27.0
+-com.google.cloud:google-cloud-core-http:1.27.0
+-com.google.http-client:google-http-client:1.23.0
and
+-com.google.cloud.tools.dependencies:dependencytest:0.0.1-SNAPSHOT
+-com.google.cloud:google-cloud-datastore:1.27.0
+-com.google.cloud:google-cloud-core-http:1.27.0
+-com.google.oauth-client:google-oauth-client:1.23.0
+-com.google.http-client:google-http-client:1.23.0
and
+-com.google.cloud.tools.dependencies:dependencytest:0.0.1-SNAPSHOT
+-com.google.cloud:google-cloud-datastore:1.27.0
+-com.google.cloud:google-cloud-core-http:1.27.0
+-com.google.http-client:google-http-client-appengine:1.23.0
+-com.google.http-client:google-http-client:1.23.0
and
+-com.google.cloud.tools.dependencies:dependencytest:0.0.1-SNAPSHOT
+-com.google.cloud:google-cloud-datastore:1.27.0
+-com.google.cloud:google-cloud-core-http:1.27.0
+-com.google.http-client:google-http-client-jackson:1.23.0
+-com.google.http-client:google-http-client:1.23.0
and
+-com.google.cloud.tools.dependencies:dependencytest:0.0.1-SNAPSHOT
+-com.google.cloud:google-cloud-datastore:1.27.0
+-com.google.cloud:google-cloud-core-http:1.27.0
+-com.google.api:gax-httpjson:0.40.0
+-com.google.http-client:google-http-client:1.23.0
and
+-com.google.cloud.tools.dependencies:dependencytest:0.0.1-SNAPSHOT
+-com.google.cloud:google-cloud-datastore:1.27.0
+-com.google.cloud.datastore:datastore-v1-proto-client:1.6.0
+-com.google.http-client:google-http-client:1.23.0
and
+-com.google.cloud.tools.dependencies:dependencytest:0.0.1-SNAPSHOT
+-com.google.cloud:google-cloud-datastore:1.27.0
+-com.google.cloud.datastore:datastore-v1-proto-client:1.6.0
+-com.google.http-client:google-http-client-protobuf:1.20.0
+-com.google.http-client:google-http-client:1.20.0
]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 3.710 s
[INFO] Finished at: 2018-04-25T08:04:18-04:00
[INFO] Final Memory: 10M/159M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-enforcer-plugin:1.4.1:enforce (enforce) on project dependencytest: Some Enforcer rules have failed. Look above for specific messages explaining why the rule failed. -> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
```
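For anyone hitting the same failure in a downstream project while this is investigated: one possible workaround (a sketch, not an endorsed fix) is to pin each diverging artifact to a single version in the consuming project's `<dependencyManagement>`. The versions below are illustrative — they are simply the newest ones appearing in the report above — and should be checked against the BOM in use:

```xml
<!-- Hypothetical workaround in the consuming project's pom.xml.
     Pinning each diverging artifact to one version forces convergence
     so the enforcer rule passes; versions here mirror the newest ones
     listed in the log above and are NOT authoritative. -->
<dependencyManagement>
  <dependencies>
    <dependency>
      <groupId>com.google.code.findbugs</groupId>
      <artifactId>jsr305</artifactId>
      <version>3.0.1</version>
    </dependency>
    <dependency>
      <groupId>com.google.protobuf</groupId>
      <artifactId>protobuf-java</artifactId>
      <version>3.5.1</version>
    </dependency>
    <dependency>
      <groupId>com.google.errorprone</groupId>
      <artifactId>error_prone_annotations</artifactId>
      <version>2.2.0</version>
    </dependency>
    <dependency>
      <groupId>com.google.http-client</groupId>
      <artifactId>google-http-client</artifactId>
      <version>1.23.0</version>
    </dependency>
  </dependencies>
</dependencyManagement>
```

Note this only masks the divergence in the consumer; the underlying issue is that the artifacts pulled in transitively by `google-cloud-datastore` do not themselves converge.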
tools dependencies dependencytest snapshot com google cloud google cloud datastore com google cloud google cloud core http io opencensus opencensus contrib http util io opencensus opencensus api and com google cloud tools dependencies dependencytest snapshot com google cloud google cloud datastore io grpc grpc core io opencensus opencensus api and com google cloud tools dependencies dependencytest snapshot com google cloud google cloud datastore io grpc grpc core io opencensus opencensus contrib grpc metrics io opencensus opencensus api dependency convergence error for com google oauth client google oauth client paths to dependency are com google cloud tools dependencies dependencytest snapshot com google cloud google cloud datastore com google cloud google cloud core http com google oauth client google oauth client and com google cloud tools dependencies dependencytest snapshot com google cloud google cloud datastore com google cloud google cloud core http com google api client google api client com google oauth client google oauth client and com google cloud tools dependencies dependencytest snapshot com google cloud google cloud datastore com google cloud datastore datastore proto client com google oauth client google oauth client dependency convergence error for io grpc grpc context paths to dependency are com google cloud tools dependencies dependencytest snapshot com google cloud google cloud datastore com google cloud google cloud core http io opencensus opencensus api io grpc grpc context and com google cloud tools dependencies dependencytest snapshot com google cloud google cloud datastore io grpc grpc core io grpc grpc context dependency convergence error for com google http client google http client paths to dependency are com google cloud tools dependencies dependencytest snapshot com google cloud google cloud datastore com google cloud google cloud core com google http client google http client and com google cloud tools dependencies dependencytest 
snapshot com google cloud google cloud datastore com google cloud google cloud core http com google auth google auth library http com google http client google http client and com google cloud tools dependencies dependencytest snapshot com google cloud google cloud datastore com google cloud google cloud core http com google http client google http client and com google cloud tools dependencies dependencytest snapshot com google cloud google cloud datastore com google cloud google cloud core http com google oauth client google oauth client com google http client google http client and com google cloud tools dependencies dependencytest snapshot com google cloud google cloud datastore com google cloud google cloud core http com google http client google http client appengine com google http client google http client and com google cloud tools dependencies dependencytest snapshot com google cloud google cloud datastore com google cloud google cloud core http com google http client google http client jackson com google http client google http client and com google cloud tools dependencies dependencytest snapshot com google cloud google cloud datastore com google cloud google cloud core http com google api gax httpjson com google http client google http client and com google cloud tools dependencies dependencytest snapshot com google cloud google cloud datastore com google cloud datastore datastore proto client com google http client google http client and com google cloud tools dependencies dependencytest snapshot com google cloud google cloud datastore com google cloud datastore datastore proto client com google http client google http client protobuf com google http client google http client build failure total time s finished at final memory failed to execute goal org apache maven plugins maven enforcer plugin enforce enforce on project dependencytest some enforcer rules have failed look above for specific messages explaining why the rule failed to see the full 
stack trace of the errors re run maven with the e switch re run maven using the x switch to enable full debug logging for more information about the errors and possible solutions please read the following articles
| 1
|
31,183
| 6,443,910,982
|
IssuesEvent
|
2017-08-12 02:29:03
|
opendatakit/opendatakit
|
https://api.github.com/repos/opendatakit/opendatakit
|
closed
|
Erorr Display wen Redirect to IP Public
|
Aggregate Priority-Medium Type-Defect
|
Originally reported on Google Code with ID 1124
```
What steps will reproduce the problem?
1.I Just Deploy the ODK Aggregate on Windows Server 2012 R2
2.Install MYSQL
3. Install ODK
What is the expected output? What do you see instead?
setting up a localhost odk aggregate + mysql + tomcat.
Instead, I still can not display
What version of the product are you using? On what operating system?
ODK Version 1.4.5 on Windows Server 2012 R2 cloud server
apache-tomcat-6.0.41
mysql-installer-community-5.6.23
Java jdk-7u51-windows-x64
Java Runtime jre-7u45-windows-x64
Please provide any additional information below.
catalina log
Mar 31, 2015 11:59:54 AM org.apache.catalina.startup.Catalina load
INFO: Initialization processed in 1448 ms
Mar 31, 2015 11:59:54 AM org.apache.catalina.core.StandardService start
INFO: Starting service Catalina
Mar 31, 2015 11:59:54 AM org.apache.catalina.core.StandardEngine start
INFO: Starting Servlet Engine: Apache Tomcat/6.0.41
Mar 31, 2015 11:59:54 AM org.apache.catalina.startup.HostConfig deployDescriptor
INFO: Deploying configuration descriptor manager.xml
Mar 31, 2015 11:59:54 AM org.apache.catalina.startup.HostConfig deployWAR
INFO: Deploying web application archive ODKAggregate.war
Mar 31, 2015 12:00:02 PM org.apache.catalina.startup.HostConfig deployDirectory
INFO: Deploying web application directory docs
Mar 31, 2015 12:00:02 PM org.apache.catalina.startup.HostConfig deployDirectory
INFO: Deploying web application directory ROOT
Mar 31, 2015 12:00:03 PM org.apache.coyote.http11.Http11Protocol start
INFO: Starting Coyote HTTP/1.1 on http-8080
Mar 31, 2015 12:00:03 PM org.apache.jk.common.ChannelSocket init
INFO: JK: ajp13 listening on /0.0.0.0:8009
Mar 31, 2015 12:00:03 PM org.apache.jk.server.JkMain start
INFO: Jk running ID=0 time=0/47 config=null
Mar 31, 2015 12:00:03 PM org.apache.catalina.startup.Catalina start
INFO: Server startup in 9110 ms
```
Reported by `lararashida` on 2015-03-31 17:42:54
<hr>
- _Attachment: ApacheManager.png<br>_
- _Attachment: errorDisplay.png<br>_
- _Attachment: RedirectLocalHost.png<br>_
|
1.0
|
Erorr Display wen Redirect to IP Public - Originally reported on Google Code with ID 1124
```
What steps will reproduce the problem?
1.I Just Deploy the ODK Aggregate on Windows Server 2012 R2
2.Install MYSQL
3. Install ODK
What is the expected output? What do you see instead?
setting up a localhost odk aggregate + mysql + tomcat.
Instead, I still can not display
What version of the product are you using? On what operating system?
ODK Version 1.4.5 on Windows Server 2012 R2 cloud server
apache-tomcat-6.0.41
mysql-installer-community-5.6.23
Java jdk-7u51-windows-x64
Java Runtime jre-7u45-windows-x64
Please provide any additional information below.
catalina log
Mar 31, 2015 11:59:54 AM org.apache.catalina.startup.Catalina load
INFO: Initialization processed in 1448 ms
Mar 31, 2015 11:59:54 AM org.apache.catalina.core.StandardService start
INFO: Starting service Catalina
Mar 31, 2015 11:59:54 AM org.apache.catalina.core.StandardEngine start
INFO: Starting Servlet Engine: Apache Tomcat/6.0.41
Mar 31, 2015 11:59:54 AM org.apache.catalina.startup.HostConfig deployDescriptor
INFO: Deploying configuration descriptor manager.xml
Mar 31, 2015 11:59:54 AM org.apache.catalina.startup.HostConfig deployWAR
INFO: Deploying web application archive ODKAggregate.war
Mar 31, 2015 12:00:02 PM org.apache.catalina.startup.HostConfig deployDirectory
INFO: Deploying web application directory docs
Mar 31, 2015 12:00:02 PM org.apache.catalina.startup.HostConfig deployDirectory
INFO: Deploying web application directory ROOT
Mar 31, 2015 12:00:03 PM org.apache.coyote.http11.Http11Protocol start
INFO: Starting Coyote HTTP/1.1 on http-8080
Mar 31, 2015 12:00:03 PM org.apache.jk.common.ChannelSocket init
INFO: JK: ajp13 listening on /0.0.0.0:8009
Mar 31, 2015 12:00:03 PM org.apache.jk.server.JkMain start
INFO: Jk running ID=0 time=0/47 config=null
Mar 31, 2015 12:00:03 PM org.apache.catalina.startup.Catalina start
INFO: Server startup in 9110 ms
```
Reported by `lararashida` on 2015-03-31 17:42:54
<hr>
- _Attachment: ApacheManager.png<br>_
- _Attachment: errorDisplay.png<br>_
- _Attachment: RedirectLocalHost.png<br>_
|
non_process
|
erorr display wen redirect to ip public originally reported on google code with id what steps will reproduce the problem i just deploy the odk aggregate on windows server install mysql install odk what is the expected output what do you see instead setting up a localhost odk aggregate mysql tomcat instead i still can not display what version of the product are you using on what operating system odk version on windows server cloud server apache tomcat mysql installer community java jdk windows java runtime jre windows please provide any additional information below catalina log mar am org apache catalina startup catalina load info initialization processed in ms mar am org apache catalina core standardservice start info starting service catalina mar am org apache catalina core standardengine start info starting servlet engine apache tomcat mar am org apache catalina startup hostconfig deploydescriptor info deploying configuration descriptor manager xml mar am org apache catalina startup hostconfig deploywar info deploying web application archive odkaggregate war mar pm org apache catalina startup hostconfig deploydirectory info deploying web application directory docs mar pm org apache catalina startup hostconfig deploydirectory info deploying web application directory root mar pm org apache coyote start info starting coyote http on http mar pm org apache jk common channelsocket init info jk listening on mar pm org apache jk server jkmain start info jk running id time config null mar pm org apache catalina startup catalina start info server startup in ms reported by lararashida on attachment apachemanager png attachment errordisplay png attachment redirectlocalhost png
| 0
|
9,918
| 2,616,010,381
|
IssuesEvent
|
2015-03-02 00:53:53
|
jasonhall/bwapi
|
https://api.github.com/repos/jasonhall/bwapi
|
closed
|
BWAPI fails to load AIModule or crashes
|
auto-migrated Type-Defect
|
```
When I build the ExampleAIModule in Debug mode BWAPI fails to load it.
The LoadLibrary call in BWAPI Returns NULL and GetLastError says 998:
Invalid memory access. either something is wrong with dynamic linking or
the BWAPI::Init() call in the DLL makes problems:
http://support.microsoft.com/kb/196069
Sometimes there also seems to be a problem with BWTA at readMap().
When I used the debug BWAPI.dll with the Release BWAPI.lib together with
BWTA and ExampleAIModule in Release everything crashed on game start (maybe
readMap()??) I used the VC++ debugger and it stopped at GameImpl.cpp line
1007 after the ASM call: popad (GameImpl::printEx
)
```
Original issue reported on code.google.com by `tren...@gmail.com` on 19 Mar 2010 at 1:20
|
1.0
|
BWAPI fails to load AIModule or crashes - ```
When I build the ExampleAIModule in Debug mode BWAPI fails to load it.
The LoadLibrary call in BWAPI Returns NULL and GetLastError says 998:
Invalid memory access. either something is wrong with dynamic linking or
the BWAPI::Init() call in the DLL makes problems:
http://support.microsoft.com/kb/196069
Sometimes there also seems to be a problem with BWTA at readMap().
When I used the debug BWAPI.dll with the Release BWAPI.lib together with
BWTA and ExampleAIModule in Release everything crashed on game start (maybe
readMap()??) I used the VC++ debugger and it stopped at GameImpl.cpp line
1007 after the ASM call: popad (GameImpl::printEx
)
```
Original issue reported on code.google.com by `tren...@gmail.com` on 19 Mar 2010 at 1:20
|
non_process
|
bwapi fails to load aimodule or crashes when i build the exampleaimodule in debug mode bwapi fails to load it the loadlibrary call in bwapi returns null and getlasterror says invalid memory access either something is wrong with dynamic linking or the bwapi init call in the dll makes problems sometimes there also seems to be a problem with bwta at readmap when i used the debug bwapi dll with the release bwapi lib together with bwta and exampleaimodule in release everything crashed on game start maybe readmap i used the vc debugger and it stopped at gameimpl cpp line after the asm call popad gameimpl printex original issue reported on code google com by tren gmail com on mar at
| 0
|
18,862
| 24,783,515,237
|
IssuesEvent
|
2022-10-24 07:55:12
|
solid/specification
|
https://api.github.com/repos/solid/specification
|
closed
|
Strategy for test-suite
|
status: Needs Process Help
|
Creating this issue to bring to attention the need for a test suite strategy. Note that there was quite a bit of progress made some months ago by @kjetilk on https://github.com/solid/test-suite, but there hasn't been much activity of late, or use by others working in various areas of the specification and/or implementations.
Some things to consider:
- Agree whether we're going to adopt the test suite framework at https://github.com/solid/test-suite as is, identity adjustments needed to it to reach adoption, or decide that we want to go in another direction.
- Ensure people working on testable material are aware of how to work with and contribute to the test suite
|
1.0
|
Strategy for test-suite - Creating this issue to bring to attention the need for a test suite strategy. Note that there was quite a bit of progress made some months ago by @kjetilk on https://github.com/solid/test-suite, but there hasn't been much activity of late, or use by others working in various areas of the specification and/or implementations.
Some things to consider:
- Agree whether we're going to adopt the test suite framework at https://github.com/solid/test-suite as is, identity adjustments needed to it to reach adoption, or decide that we want to go in another direction.
- Ensure people working on testable material are aware of how to work with and contribute to the test suite
|
process
|
strategy for test suite creating this issue to bring to attention the need for a test suite strategy note that there was quite a bit of progress made some months ago by kjetilk on but there hasn t been much activity of late or use by others working in various areas of the specification and or implementations some things to consider agree whether we re going to adopt the test suite framework at as is identity adjustments needed to it to reach adoption or decide that we want to go in another direction ensure people working on testable material are aware of how to work with and contribute to the test suite
| 1
|
275,568
| 20,925,901,642
|
IssuesEvent
|
2022-03-24 22:52:50
|
capactio/capact
|
https://api.github.com/repos/capactio/capact
|
closed
|
Implement Go Template storage backend service
|
enhancement area/hub area/documentation
|
## Description
In Capact manifests, Content Developer is able to render `config` based on other TypeInstances data, see:
https://github.com/capactio/hub-manifests/blob/9987550e2341cc761c8613cc4e6bb716d5a328f1/manifests/implementation/gcp/cloudsql/postgresql/install-0.2.0.yaml#L102-L108
This is not supported directly in new external backends. We want to have a dedicated backend which will do this projection.
Read more on [Delegated TypeInstance projection](https://github.com/capactio/capact/blob/main/docs/proposal/20211207-delegated-storage.md#dynamic-typeinstance-projections) in delegated proposal.
## AC
- Dedicated backend storage for TypeInstance projection created under `cmd/` in Capact repo.
- Support Go templates
- Backend dedicated documentation under https://capact.io/docs/next/feature/storage-backends/introduction
## Related issues
See epic #604 for reason and use cases.
|
1.0
|
Implement Go Template storage backend service - ## Description
In Capact manifests, Content Developer is able to render `config` based on other TypeInstances data, see:
https://github.com/capactio/hub-manifests/blob/9987550e2341cc761c8613cc4e6bb716d5a328f1/manifests/implementation/gcp/cloudsql/postgresql/install-0.2.0.yaml#L102-L108
This is not supported directly in new external backends. We want to have a dedicated backend which will do this projection.
Read more on [Delegated TypeInstance projection](https://github.com/capactio/capact/blob/main/docs/proposal/20211207-delegated-storage.md#dynamic-typeinstance-projections) in delegated proposal.
## AC
- Dedicated backend storage for TypeInstance projection created under `cmd/` in Capact repo.
- Support Go templates
- Backend dedicated documentation under https://capact.io/docs/next/feature/storage-backends/introduction
## Related issues
See epic #604 for reason and use cases.
|
non_process
|
implement go template storage backend service description in capact manifests content developer is able to render config based on other typeinstances data see this is not supported directly in new external backends we want to have a dedicated backend which will do this projection read more on in delegated proposal ac dedicated backend storage for typeinstance projection created under cmd in capact repo support go templates backend dedicated documentation under related issues see epic for reason and use cases
| 0
|
128,682
| 18,070,022,569
|
IssuesEvent
|
2021-09-21 01:02:15
|
hugh-whitesource/NodeGoat-1
|
https://api.github.com/repos/hugh-whitesource/NodeGoat-1
|
closed
|
WS-2019-0231 (Medium) detected in adm-zip-0.4.4.tgz - autoclosed
|
security vulnerability
|
## WS-2019-0231 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>adm-zip-0.4.4.tgz</b></p></summary>
<p>A Javascript implementation of zip for nodejs. Allows user to create or extract zip files both in memory or to/from disk</p>
<p>Library home page: <a href="https://registry.npmjs.org/adm-zip/-/adm-zip-0.4.4.tgz">https://registry.npmjs.org/adm-zip/-/adm-zip-0.4.4.tgz</a></p>
<p>
Dependency Hierarchy:
- selenium-webdriver-2.53.3.tgz (Root Library)
- :x: **adm-zip-0.4.4.tgz** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/hugh-whitesource/NodeGoat-1/commit/30bd98bfe53110c5a2705fae9815d7ae157397db">30bd98bfe53110c5a2705fae9815d7ae157397db</a></p>
<p>Found in base branch: <b>master</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
adm-zip versions before 0.4.9 are vulnerable to Arbitrary File Write due to extraction of a specifically crafted archive that contains path traversal filenames
<p>Publish Date: 2018-04-22
<p>URL: <a href=https://hackerone.com/reports/362118>WS-2019-0231</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 2 Score Details (<b>5.0</b>)</summary>
<p>
Base Score Metrics not available</p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://www.npmjs.com/advisories/994">https://www.npmjs.com/advisories/994</a></p>
<p>Release Date: 2019-09-09</p>
<p>Fix Resolution: 0.4.9</p>
</p>
</details>
<p></p>
<!-- <REMEDIATE>{"isOpenPROnVulnerability":false,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"javascript/Node.js","packageName":"adm-zip","packageVersion":"0.4.4","packageFilePaths":[],"isTransitiveDependency":true,"dependencyTree":"selenium-webdriver:2.53.3;adm-zip:0.4.4","isMinimumFixVersionAvailable":true,"minimumFixVersion":"0.4.9"}],"baseBranches":["master"],"vulnerabilityIdentifier":"WS-2019-0231","vulnerabilityDetails":"adm-zip versions before 0.4.9 are vulnerable to Arbitrary File Write due to extraction of a specifically crafted archive that contains path traversal filenames","vulnerabilityUrl":"https://hackerone.com/reports/362118","cvss2Severity":"medium","cvss2Score":"5.0","extraData":{}}</REMEDIATE> -->
|
True
|
WS-2019-0231 (Medium) detected in adm-zip-0.4.4.tgz - autoclosed - ## WS-2019-0231 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>adm-zip-0.4.4.tgz</b></p></summary>
<p>A Javascript implementation of zip for nodejs. Allows user to create or extract zip files both in memory or to/from disk</p>
<p>Library home page: <a href="https://registry.npmjs.org/adm-zip/-/adm-zip-0.4.4.tgz">https://registry.npmjs.org/adm-zip/-/adm-zip-0.4.4.tgz</a></p>
<p>
Dependency Hierarchy:
- selenium-webdriver-2.53.3.tgz (Root Library)
- :x: **adm-zip-0.4.4.tgz** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/hugh-whitesource/NodeGoat-1/commit/30bd98bfe53110c5a2705fae9815d7ae157397db">30bd98bfe53110c5a2705fae9815d7ae157397db</a></p>
<p>Found in base branch: <b>master</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
adm-zip versions before 0.4.9 are vulnerable to Arbitrary File Write due to extraction of a specifically crafted archive that contains path traversal filenames
<p>Publish Date: 2018-04-22
<p>URL: <a href=https://hackerone.com/reports/362118>WS-2019-0231</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 2 Score Details (<b>5.0</b>)</summary>
<p>
Base Score Metrics not available</p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://www.npmjs.com/advisories/994">https://www.npmjs.com/advisories/994</a></p>
<p>Release Date: 2019-09-09</p>
<p>Fix Resolution: 0.4.9</p>
</p>
</details>
<p></p>
<!-- <REMEDIATE>{"isOpenPROnVulnerability":false,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"javascript/Node.js","packageName":"adm-zip","packageVersion":"0.4.4","packageFilePaths":[],"isTransitiveDependency":true,"dependencyTree":"selenium-webdriver:2.53.3;adm-zip:0.4.4","isMinimumFixVersionAvailable":true,"minimumFixVersion":"0.4.9"}],"baseBranches":["master"],"vulnerabilityIdentifier":"WS-2019-0231","vulnerabilityDetails":"adm-zip versions before 0.4.9 are vulnerable to Arbitrary File Write due to extraction of a specifically crafted archive that contains path traversal filenames","vulnerabilityUrl":"https://hackerone.com/reports/362118","cvss2Severity":"medium","cvss2Score":"5.0","extraData":{}}</REMEDIATE> -->
|
non_process
|
ws medium detected in adm zip tgz autoclosed ws medium severity vulnerability vulnerable library adm zip tgz a javascript implementation of zip for nodejs allows user to create or extract zip files both in memory or to from disk library home page a href dependency hierarchy selenium webdriver tgz root library x adm zip tgz vulnerable library found in head commit a href found in base branch master vulnerability details adm zip versions before are vulnerable to arbitrary file write due to extraction of a specifically crafted archive that contains path traversal filenames publish date url a href cvss score details base score metrics not available suggested fix type upgrade version origin a href release date fix resolution isopenpronvulnerability false ispackagebased true isdefaultbranch true packages istransitivedependency true dependencytree selenium webdriver adm zip isminimumfixversionavailable true minimumfixversion basebranches vulnerabilityidentifier ws vulnerabilitydetails adm zip versions before are vulnerable to arbitrary file write due to extraction of a specifically crafted archive that contains path traversal filenames vulnerabilityurl
| 0
|
17,166
| 22,743,192,477
|
IssuesEvent
|
2022-07-07 06:43:34
|
camunda/zeebe
|
https://api.github.com/repos/camunda/zeebe
|
opened
|
Remove LogStream writers from Engine
|
kind/toil team/distributed team/process-automation area/maintainability
|
**Description**
Part of #9600
In the current state the TypedProcessors (which might become the Engine later) and other entities, like [JobTimeoutTrigger](https://github.com/camunda/zeebe/blob/2d4901c0e516b7fa8b1cf3408b06708c1644ca57/engine/src/main/java/io/camunda/zeebe/engine/processing/job/JobTimeoutTrigger.java#L79-L82), DeploymentDistributor etc. have knowledge about how to write Records to the LogStream abstraction. Ideally, they shouldn't care about that detail. In the end, it would be great if the Engine can get something in and produce something out. For that we have to do some pre-work like removing the actual LogStreamBatchWriters usage and reduce related interfaces.
In the POC #9602 we split up the implementation of the `TypedStreamWriterImpl` so that it just writes into a wrapped buffer. This allowed to pre-claim that buffer in the start, and initialize the Engine with that Writer. This means we can reduce the dependency to the LogStream (helps when we split the Engine and the StreamPlatform). Later we renamed the Writers to something like Builder, but this is discussable.
**Todo:**
- [ ] In order to work on this in parallel we copy the `TypedStreamWriterImpl` to a new class and make certain changes
- [ ] Copy the content from `TypedStreamWriterImpl` to a new class
- [ ] Remove the LogStreamBatch usage from that class, write directly into a pre-claimed buffer
- [ ] Add a new method to the LogStreamBatch
- [ ] Write tests for both
- [ ] Find a good new name for the Writers Interface, including Command-, Rejection-, StateWriter. In the POC #9602, we called it Builders, because wanted in the end build a list of Records. But maybe we can also use a different name here.
- [ ] Create a new Result Interface / Class which can be returned by the Engine
- [ ] The Result can for simplicity return a BufferWriter like we did in the POC #9602 (we can improve that later), but we can also just return a list of Records if you find a good way for it
- [ ] Result should be used by Processing and ScheduledTasks to return Results
- [ ] Engine should return the result - :warning: here we need the Stream Processing changes first
- [ ] Use result in platform, write the records to the LogStreamBatchWriter
- [ ] **Bonus:** The result contains only the list of the records (instead of the general Buffer or BufferWriter)
- [ ] **Bonus2** The serialization is done in the StreamPlatform on writing to the LogStreamBatchWriter
- [ ] We might need to adjust some tests
|
1.0
|
Remove LogStream writers from Engine - **Description**
Part of #9600
In the current state the TypedProcessors (which might become the Engine later) and other entities, like [JobTimeoutTrigger](https://github.com/camunda/zeebe/blob/2d4901c0e516b7fa8b1cf3408b06708c1644ca57/engine/src/main/java/io/camunda/zeebe/engine/processing/job/JobTimeoutTrigger.java#L79-L82), DeploymentDistributor etc. have knowledge about how to write Records to the LogStream abstraction. Ideally, they shouldn't care about that detail. In the end, it would be great if the Engine can get something in and produce something out. For that we have to do some pre-work like removing the actual LogStreamBatchWriters usage and reduce related interfaces.
In the POC #9602 we split up the implementation of the `TypedStreamWriterImpl` so that it just writes into a wrapped buffer. This allowed to pre-claim that buffer in the start, and initialize the Engine with that Writer. This means we can reduce the dependency to the LogStream (helps when we split the Engine and the StreamPlatform). Later we renamed the Writers to something like Builder, but this is discussable.
**Todo:**
- [ ] In order to work on this in parallel we copy the `TypedStreamWriterImpl` to a new class and make certain changes
- [ ] Copy the content from `TypedStreamWriterImpl` to a new class
- [ ] Remove the LogStreamBatch usage from that class, write directly into a pre-claimed buffer
- [ ] Add a new method to the LogStreamBatch
- [ ] Write tests for both
- [ ] Find a good new name for the Writers Interface, including Command-, Rejection-, StateWriter. In the POC #9602, we called it Builders, because wanted in the end build a list of Records. But maybe we can also use a different name here.
- [ ] Create a new Result Interface / Class which can be returned by the Engine
- [ ] The Result can for simplicity return a BufferWriter like we did in the POC #9602 (we can improve that later), but we can also just return a list of Records if you find a good way for it
- [ ] Result should be used by Processing and ScheduledTasks to return Results
- [ ] Engine should return the result - :warning: here we need the Stream Processing changes first
- [ ] Use result in platform, write the records to the LogStreamBatchWriter
- [ ] **Bonus:** The result contains only the list of the records (instead of the general Buffer or BufferWriter)
- [ ] **Bonus2** The serialization is done in the StreamPlatform on writing to the LogStreamBatchWriter
- [ ] We might need to adjust some tests
|
process
|
remove logstream writers from engine description part of in the current state the typedprocessors which might become the engine later and other entities like deploymentdistributor etc have knowledge about how to write records to the logstream abstraction ideally they shouldn t care about that detail in the end it would be great if the engine can get something in and produce something out for that we have to do some pre work like removing the actual logstreambatchwriters usage and reduce related interfaces in the poc we split up the implementation of the typedstreamwriterimpl so that it just writes into a wrapped buffer this allowed to pre claim that buffer in the start and initialize the engine with that writer this means we can reduce the dependency to the logstream helps when we split the engine and the streamplatform later we renamed the writers to something like builder but this is discussable todo in order to work on this in parallel we copy the typedstreamwriterimpl to a new class and make certain changes copy the content from typedstreamwriterimpl to a new class remove the logstreambatch usage from that class write directly into a pre claimed buffer add a new method to the logstreambatch write tests for both find a good new name for the writers interface including command rejection statewriter in the poc we called it builders because wanted in the end build a list of records but maybe we can also use a different name here create a new result interface class which can be returned by the engine the result can for simplicity return a bufferwriter like we did in the poc we can improve that later but we can also just return a list of records if you find a good way for it result should be used by processing and scheduledtasks to return results engine should return the result warning here we need the stream processing changes first use result in platform write the records to the logstreambatchwriter bonus the result contains only the list of the records instead of 
the general buffer or bufferwriter the serialization is done in the streamplatform on writing to the logstreambatchwriter we might need to adjust some tests
| 1
|
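The record above describes splitting the typed stream writer so the engine returns a result of records while the stream platform handles serialization and writing to the log stream. A minimal illustrative sketch of that pattern — in Python for brevity, with hypothetical names (`RecordBuilder`, `ProcessingResult`, `StreamPlatform`), since the actual codebase is not shown here:

```python
from dataclasses import dataclass, field

@dataclass
class ProcessingResult:
    """Result returned by the engine: just a list of records."""
    records: list = field(default_factory=list)

class RecordBuilder:
    """Stands in for the renamed writer: collects records in a
    wrapped buffer instead of writing to the log stream directly."""
    def __init__(self):
        self._records = []
    def append(self, record):
        self._records.append(record)
    def build(self):
        return ProcessingResult(list(self._records))

class Engine:
    """Gets a command in, produces a result out — no log stream knowledge."""
    def process(self, command):
        builder = RecordBuilder()
        builder.append({"type": "EVENT", "value": command})
        return builder.build()

class StreamPlatform:
    """Serialization and log writing stay on the platform side."""
    def __init__(self):
        self.log = []
    def on_command(self, engine, command):
        result = engine.process(command)
        # writing to the log stream batch writer happens here,
        # outside the engine
        self.log.extend(result.records)
```

Because the engine only fills a buffer, it no longer depends on the log stream abstraction — which is the decoupling the record asks for.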
18,718
| 5,696,544,882
|
IssuesEvent
|
2017-04-16 13:00:17
|
javaparser/javaparser
|
https://api.github.com/repos/javaparser/javaparser
|
closed
|
Generators should sort properties (minor)
|
Feature request Metamodel/code generation
|
In this way we would not have irrelevant changes in generated code
|
1.0
|
Generators should sort properties (minor) - In this way we would not have irrelevant changes in generated code
|
non_process
|
generators should sort properties minor in this way we would not have irrelevant changes in generated code
| 0
|
13,377
| 15,838,344,615
|
IssuesEvent
|
2021-04-06 22:18:03
|
90301/TextReplace
|
https://api.github.com/repos/90301/TextReplace
|
closed
|
Line Instance Count
|
Log Processor
|
LineMatchCount function:
- Text to match
- Min number of instances per line
- [Optional] > or <
If a match is found:
Return the line.
Possibly add another that counts the instances per line and returns the count.
Ex:
LineMatchCount( [ , 2 )
this[1][2] <---- would trigger
this[1] <----- would not trigger
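The spec above can be sketched as follows — assuming the default comparison is "at least min instances" (consistent with the `this[1][2]` example), with the optional `>`/`<` passed as a string; names and defaults are guesses from the spec:

```python
def line_match_count(text, pattern, min_count, op=">="):
    """Return the lines of `text` whose number of `pattern` occurrences
    satisfies `count <op> min_count` (default: at least min_count)."""
    matched = []
    for line in text.splitlines():
        count = line.count(pattern)
        if op == ">":
            hit = count > min_count
        elif op == "<":
            hit = count < min_count
        else:  # default: at least min_count instances per line
            hit = count >= min_count
        if hit:
            matched.append(line)
    return matched

def line_instance_counts(text, pattern):
    """Companion helper suggested at the end of the spec:
    return the per-line occurrence count instead of the lines."""
    return [line.count(pattern) for line in text.splitlines()]
```

With the spec's own example, `line_match_count("this[1][2]\nthis[1]", "[", 2)` returns only the `this[1][2]` line.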
|
1.0
|
Line Instance Count - LineMatchCount function:
- Text to match
- Min number of instances per line
- [Optional] > or <
If a match is found:
Return the line.
Possibly add another that counts the instances per line and returns the count.
Ex:
LineMatchCount( [ , 2 )
this[1][2] <---- would trigger
this[1] <----- would not trigger
|
process
|
line instance count linematchcount function text to match min number of instances per line or if a match is found return the line possibly add another that counts the instances per line and returns the count ex linematchcount this would trigger this would not trigger
| 1
|
126,091
| 26,778,635,947
|
IssuesEvent
|
2023-01-31 19:11:16
|
SAST-org/SAST-Test-Repo-a8102749-e741-4799-bcc4-443eaad2f363
|
https://api.github.com/repos/SAST-org/SAST-Test-Repo-a8102749-e741-4799-bcc4-443eaad2f363
|
opened
|
Code Security Report: 24 high severity findings, 27 total findings
|
code security findings
|
# Code Security Report
**Latest Scan:** 2023-01-31 07:10pm
**Total Findings:** 27
**Tested Project Files:** 117
**Detected Programming Languages:** 2
<!-- SAST-MANUAL-SCAN-START -->
- [ ] Check this box to manually trigger a scan
<!-- SAST-MANUAL-SCAN-END -->
## Language: JavaScript / Node.js
> No vulnerability findings detected.
## Language: Python
| Severity | CWE | Vulnerability Type | Count |
|-|-|-|-|
|<img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> High|[CWE-89](https://cwe.mitre.org/data/definitions/89.html)|SQL Injection|24|
|<img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Medium|[CWE-798](https://cwe.mitre.org/data/definitions/798.html)|Hardcoded Password/Credentials|3|
### Details
> The below list presents the 20 most relevant findings that need your attention. To view information on the remaining findings, navigate to the [Mend SAST Application](https://dev.whitesourcesoftware.com/sast/#/scans/8317542d-33e0-42c1-8b5f-67cfab63cd5a/details).
<details>
<summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20>SQL Injection (CWE-89) : 20</summary>
#### Findings
<details>
<summary>bad/libuser.py:25</summary>
https://github.com/SAST-org/SAST-Test-Repo-a8102749-e741-4799-bcc4-443eaad2f363/blob/427f62ad368ac44f5cf98c63c5c90e81f24a5437/bad/libuser.py#L20-L25
<details>
<summary> Trace </summary>
https://github.com/SAST-org/SAST-Test-Repo-a8102749-e741-4799-bcc4-443eaad2f363/blob/427f62ad368ac44f5cf98c63c5c90e81f24a5437/bad/mod_user.py#L17
https://github.com/SAST-org/SAST-Test-Repo-a8102749-e741-4799-bcc4-443eaad2f363/blob/427f62ad368ac44f5cf98c63c5c90e81f24a5437/bad/mod_user.py#L20
https://github.com/SAST-org/SAST-Test-Repo-a8102749-e741-4799-bcc4-443eaad2f363/blob/427f62ad368ac44f5cf98c63c5c90e81f24a5437/bad/mod_user.py#L32
https://github.com/SAST-org/SAST-Test-Repo-a8102749-e741-4799-bcc4-443eaad2f363/blob/427f62ad368ac44f5cf98c63c5c90e81f24a5437/bad/libuser.py#L20
https://github.com/SAST-org/SAST-Test-Repo-a8102749-e741-4799-bcc4-443eaad2f363/blob/427f62ad368ac44f5cf98c63c5c90e81f24a5437/bad/libuser.py#L25
</details>
</details>
<details>
<summary>bad/libuser.py:25</summary>
https://github.com/SAST-org/SAST-Test-Repo-a8102749-e741-4799-bcc4-443eaad2f363/blob/427f62ad368ac44f5cf98c63c5c90e81f24a5437/bad/libuser.py#L20-L25
<details>
<summary> Trace </summary>
https://github.com/SAST-org/SAST-Test-Repo-a8102749-e741-4799-bcc4-443eaad2f363/blob/427f62ad368ac44f5cf98c63c5c90e81f24a5437/bad/mod_user.py#L16
https://github.com/SAST-org/SAST-Test-Repo-a8102749-e741-4799-bcc4-443eaad2f363/blob/427f62ad368ac44f5cf98c63c5c90e81f24a5437/bad/mod_user.py#L20
https://github.com/SAST-org/SAST-Test-Repo-a8102749-e741-4799-bcc4-443eaad2f363/blob/427f62ad368ac44f5cf98c63c5c90e81f24a5437/bad/mod_user.py#L32
https://github.com/SAST-org/SAST-Test-Repo-a8102749-e741-4799-bcc4-443eaad2f363/blob/427f62ad368ac44f5cf98c63c5c90e81f24a5437/bad/libuser.py#L20
https://github.com/SAST-org/SAST-Test-Repo-a8102749-e741-4799-bcc4-443eaad2f363/blob/427f62ad368ac44f5cf98c63c5c90e81f24a5437/bad/libuser.py#L25
</details>
</details>
<details>
<summary>bad/libuser.py:12</summary>
https://github.com/SAST-org/SAST-Test-Repo-a8102749-e741-4799-bcc4-443eaad2f363/blob/427f62ad368ac44f5cf98c63c5c90e81f24a5437/bad/libuser.py#L7-L12
<details>
<summary> Trace </summary>
https://github.com/SAST-org/SAST-Test-Repo-a8102749-e741-4799-bcc4-443eaad2f363/blob/427f62ad368ac44f5cf98c63c5c90e81f24a5437/bad/mod_user.py#L69
https://github.com/SAST-org/SAST-Test-Repo-a8102749-e741-4799-bcc4-443eaad2f363/blob/427f62ad368ac44f5cf98c63c5c90e81f24a5437/bad/libuser.py#L5
https://github.com/SAST-org/SAST-Test-Repo-a8102749-e741-4799-bcc4-443eaad2f363/blob/427f62ad368ac44f5cf98c63c5c90e81f24a5437/bad/libuser.py#L12
</details>
</details>
<details>
<summary>bad/libuser.py:25</summary>
https://github.com/SAST-org/SAST-Test-Repo-a8102749-e741-4799-bcc4-443eaad2f363/blob/427f62ad368ac44f5cf98c63c5c90e81f24a5437/bad/libuser.py#L20-L25
<details>
<summary> Trace </summary>
https://github.com/SAST-org/SAST-Test-Repo-a8102749-e741-4799-bcc4-443eaad2f363/blob/427f62ad368ac44f5cf98c63c5c90e81f24a5437/bad/mod_user.py#L45
https://github.com/SAST-org/SAST-Test-Repo-a8102749-e741-4799-bcc4-443eaad2f363/blob/427f62ad368ac44f5cf98c63c5c90e81f24a5437/bad/libuser.py#L20
https://github.com/SAST-org/SAST-Test-Repo-a8102749-e741-4799-bcc4-443eaad2f363/blob/427f62ad368ac44f5cf98c63c5c90e81f24a5437/bad/libuser.py#L25
</details>
</details>
<details>
<summary>bad/libuser.py:25</summary>
https://github.com/SAST-org/SAST-Test-Repo-a8102749-e741-4799-bcc4-443eaad2f363/blob/427f62ad368ac44f5cf98c63c5c90e81f24a5437/bad/libuser.py#L20-L25
<details>
<summary> Trace </summary>
https://github.com/SAST-org/SAST-Test-Repo-a8102749-e741-4799-bcc4-443eaad2f363/blob/427f62ad368ac44f5cf98c63c5c90e81f24a5437/bad/mod_user.py#L45
https://github.com/SAST-org/SAST-Test-Repo-a8102749-e741-4799-bcc4-443eaad2f363/blob/427f62ad368ac44f5cf98c63c5c90e81f24a5437/bad/libuser.py#L20
https://github.com/SAST-org/SAST-Test-Repo-a8102749-e741-4799-bcc4-443eaad2f363/blob/427f62ad368ac44f5cf98c63c5c90e81f24a5437/bad/libuser.py#L25
</details>
</details>
<details>
<summary>bad/libuser.py:53</summary>
https://github.com/SAST-org/SAST-Test-Repo-a8102749-e741-4799-bcc4-443eaad2f363/blob/427f62ad368ac44f5cf98c63c5c90e81f24a5437/bad/libuser.py#L48-L53
<details>
<summary> Trace </summary>
https://github.com/SAST-org/SAST-Test-Repo-a8102749-e741-4799-bcc4-443eaad2f363/blob/427f62ad368ac44f5cf98c63c5c90e81f24a5437/bad/mod_user.py#L69
https://github.com/SAST-org/SAST-Test-Repo-a8102749-e741-4799-bcc4-443eaad2f363/blob/427f62ad368ac44f5cf98c63c5c90e81f24a5437/bad/libuser.py#L46
https://github.com/SAST-org/SAST-Test-Repo-a8102749-e741-4799-bcc4-443eaad2f363/blob/427f62ad368ac44f5cf98c63c5c90e81f24a5437/bad/libuser.py#L53
</details>
</details>
<details>
<summary>bad/libuser.py:25</summary>
https://github.com/SAST-org/SAST-Test-Repo-a8102749-e741-4799-bcc4-443eaad2f363/blob/427f62ad368ac44f5cf98c63c5c90e81f24a5437/bad/libuser.py#L20-L25
<details>
<summary> Trace </summary>
https://github.com/SAST-org/SAST-Test-Repo-a8102749-e741-4799-bcc4-443eaad2f363/blob/427f62ad368ac44f5cf98c63c5c90e81f24a5437/bad/mod_user.py#L16
https://github.com/SAST-org/SAST-Test-Repo-a8102749-e741-4799-bcc4-443eaad2f363/blob/427f62ad368ac44f5cf98c63c5c90e81f24a5437/bad/mod_user.py#L20
https://github.com/SAST-org/SAST-Test-Repo-a8102749-e741-4799-bcc4-443eaad2f363/blob/427f62ad368ac44f5cf98c63c5c90e81f24a5437/bad/libuser.py#L20
https://github.com/SAST-org/SAST-Test-Repo-a8102749-e741-4799-bcc4-443eaad2f363/blob/427f62ad368ac44f5cf98c63c5c90e81f24a5437/bad/libuser.py#L25
</details>
</details>
<details>
<summary>bad/libuser.py:25</summary>
https://github.com/SAST-org/SAST-Test-Repo-a8102749-e741-4799-bcc4-443eaad2f363/blob/427f62ad368ac44f5cf98c63c5c90e81f24a5437/bad/libuser.py#L20-L25
<details>
<summary> Trace </summary>
https://github.com/SAST-org/SAST-Test-Repo-a8102749-e741-4799-bcc4-443eaad2f363/blob/427f62ad368ac44f5cf98c63c5c90e81f24a5437/bad/mod_user.py#L17
https://github.com/SAST-org/SAST-Test-Repo-a8102749-e741-4799-bcc4-443eaad2f363/blob/427f62ad368ac44f5cf98c63c5c90e81f24a5437/bad/mod_user.py#L20
https://github.com/SAST-org/SAST-Test-Repo-a8102749-e741-4799-bcc4-443eaad2f363/blob/427f62ad368ac44f5cf98c63c5c90e81f24a5437/bad/libuser.py#L20
https://github.com/SAST-org/SAST-Test-Repo-a8102749-e741-4799-bcc4-443eaad2f363/blob/427f62ad368ac44f5cf98c63c5c90e81f24a5437/bad/libuser.py#L25
</details>
</details>
<details>
<summary>bad/libuser.py:25</summary>
https://github.com/SAST-org/SAST-Test-Repo-a8102749-e741-4799-bcc4-443eaad2f363/blob/427f62ad368ac44f5cf98c63c5c90e81f24a5437/bad/libuser.py#L20-L25
<details>
<summary> Trace </summary>
https://github.com/SAST-org/SAST-Test-Repo-a8102749-e741-4799-bcc4-443eaad2f363/blob/427f62ad368ac44f5cf98c63c5c90e81f24a5437/bad/mod_user.py#L17
https://github.com/SAST-org/SAST-Test-Repo-a8102749-e741-4799-bcc4-443eaad2f363/blob/427f62ad368ac44f5cf98c63c5c90e81f24a5437/bad/mod_user.py#L20
https://github.com/SAST-org/SAST-Test-Repo-a8102749-e741-4799-bcc4-443eaad2f363/blob/427f62ad368ac44f5cf98c63c5c90e81f24a5437/bad/libuser.py#L20
https://github.com/SAST-org/SAST-Test-Repo-a8102749-e741-4799-bcc4-443eaad2f363/blob/427f62ad368ac44f5cf98c63c5c90e81f24a5437/bad/libuser.py#L25
</details>
</details>
<details>
<summary>bad/libuser.py:25</summary>
https://github.com/SAST-org/SAST-Test-Repo-a8102749-e741-4799-bcc4-443eaad2f363/blob/427f62ad368ac44f5cf98c63c5c90e81f24a5437/bad/libuser.py#L20-L25
<details>
<summary> Trace </summary>
https://github.com/SAST-org/SAST-Test-Repo-a8102749-e741-4799-bcc4-443eaad2f363/blob/427f62ad368ac44f5cf98c63c5c90e81f24a5437/bad/mod_user.py#L46
https://github.com/SAST-org/SAST-Test-Repo-a8102749-e741-4799-bcc4-443eaad2f363/blob/427f62ad368ac44f5cf98c63c5c90e81f24a5437/bad/libuser.py#L20
https://github.com/SAST-org/SAST-Test-Repo-a8102749-e741-4799-bcc4-443eaad2f363/blob/427f62ad368ac44f5cf98c63c5c90e81f24a5437/bad/libuser.py#L25
</details>
</details>
<details>
<summary>bad/libuser.py:25</summary>
https://github.com/SAST-org/SAST-Test-Repo-a8102749-e741-4799-bcc4-443eaad2f363/blob/427f62ad368ac44f5cf98c63c5c90e81f24a5437/bad/libuser.py#L20-L25
<details>
<summary> Trace </summary>
https://github.com/SAST-org/SAST-Test-Repo-a8102749-e741-4799-bcc4-443eaad2f363/blob/427f62ad368ac44f5cf98c63c5c90e81f24a5437/bad/mod_user.py#L17
https://github.com/SAST-org/SAST-Test-Repo-a8102749-e741-4799-bcc4-443eaad2f363/blob/427f62ad368ac44f5cf98c63c5c90e81f24a5437/bad/mod_user.py#L20
https://github.com/SAST-org/SAST-Test-Repo-a8102749-e741-4799-bcc4-443eaad2f363/blob/427f62ad368ac44f5cf98c63c5c90e81f24a5437/bad/mod_user.py#L32
https://github.com/SAST-org/SAST-Test-Repo-a8102749-e741-4799-bcc4-443eaad2f363/blob/427f62ad368ac44f5cf98c63c5c90e81f24a5437/bad/libuser.py#L20
https://github.com/SAST-org/SAST-Test-Repo-a8102749-e741-4799-bcc4-443eaad2f363/blob/427f62ad368ac44f5cf98c63c5c90e81f24a5437/bad/libuser.py#L25
</details>
</details>
<details>
<summary>bad/libuser.py:25</summary>
https://github.com/SAST-org/SAST-Test-Repo-a8102749-e741-4799-bcc4-443eaad2f363/blob/427f62ad368ac44f5cf98c63c5c90e81f24a5437/bad/libuser.py#L20-L25
<details>
<summary> Trace </summary>
https://github.com/SAST-org/SAST-Test-Repo-a8102749-e741-4799-bcc4-443eaad2f363/blob/427f62ad368ac44f5cf98c63c5c90e81f24a5437/bad/mod_user.py#L46
https://github.com/SAST-org/SAST-Test-Repo-a8102749-e741-4799-bcc4-443eaad2f363/blob/427f62ad368ac44f5cf98c63c5c90e81f24a5437/bad/libuser.py#L20
https://github.com/SAST-org/SAST-Test-Repo-a8102749-e741-4799-bcc4-443eaad2f363/blob/427f62ad368ac44f5cf98c63c5c90e81f24a5437/bad/libuser.py#L25
</details>
</details>
<details>
<summary>bad/libuser.py:12</summary>
https://github.com/SAST-org/SAST-Test-Repo-a8102749-e741-4799-bcc4-443eaad2f363/blob/427f62ad368ac44f5cf98c63c5c90e81f24a5437/bad/libuser.py#L7-L12
<details>
<summary> Trace </summary>
https://github.com/SAST-org/SAST-Test-Repo-a8102749-e741-4799-bcc4-443eaad2f363/blob/427f62ad368ac44f5cf98c63c5c90e81f24a5437/bad/mod_user.py#L17
https://github.com/SAST-org/SAST-Test-Repo-a8102749-e741-4799-bcc4-443eaad2f363/blob/427f62ad368ac44f5cf98c63c5c90e81f24a5437/bad/libuser.py#L5
https://github.com/SAST-org/SAST-Test-Repo-a8102749-e741-4799-bcc4-443eaad2f363/blob/427f62ad368ac44f5cf98c63c5c90e81f24a5437/bad/libuser.py#L12
</details>
</details>
<details>
<summary>bad/libuser.py:12</summary>
https://github.com/SAST-org/SAST-Test-Repo-a8102749-e741-4799-bcc4-443eaad2f363/blob/427f62ad368ac44f5cf98c63c5c90e81f24a5437/bad/libuser.py#L7-L12
<details>
<summary> Trace </summary>
https://github.com/SAST-org/SAST-Test-Repo-a8102749-e741-4799-bcc4-443eaad2f363/blob/427f62ad368ac44f5cf98c63c5c90e81f24a5437/bad/mod_user.py#L17
https://github.com/SAST-org/SAST-Test-Repo-a8102749-e741-4799-bcc4-443eaad2f363/blob/427f62ad368ac44f5cf98c63c5c90e81f24a5437/bad/libuser.py#L5
https://github.com/SAST-org/SAST-Test-Repo-a8102749-e741-4799-bcc4-443eaad2f363/blob/427f62ad368ac44f5cf98c63c5c90e81f24a5437/bad/libuser.py#L12
</details>
</details>
<details>
<summary>bad/libuser.py:12</summary>
https://github.com/SAST-org/SAST-Test-Repo-a8102749-e741-4799-bcc4-443eaad2f363/blob/427f62ad368ac44f5cf98c63c5c90e81f24a5437/bad/libuser.py#L7-L12
<details>
<summary> Trace </summary>
https://github.com/SAST-org/SAST-Test-Repo-a8102749-e741-4799-bcc4-443eaad2f363/blob/427f62ad368ac44f5cf98c63c5c90e81f24a5437/bad/mod_user.py#L69
https://github.com/SAST-org/SAST-Test-Repo-a8102749-e741-4799-bcc4-443eaad2f363/blob/427f62ad368ac44f5cf98c63c5c90e81f24a5437/bad/libuser.py#L5
https://github.com/SAST-org/SAST-Test-Repo-a8102749-e741-4799-bcc4-443eaad2f363/blob/427f62ad368ac44f5cf98c63c5c90e81f24a5437/bad/libuser.py#L12
</details>
</details>
<details>
<summary>bad/libuser.py:12</summary>
https://github.com/SAST-org/SAST-Test-Repo-a8102749-e741-4799-bcc4-443eaad2f363/blob/427f62ad368ac44f5cf98c63c5c90e81f24a5437/bad/libuser.py#L7-L12
<details>
<summary> Trace </summary>
https://github.com/SAST-org/SAST-Test-Repo-a8102749-e741-4799-bcc4-443eaad2f363/blob/427f62ad368ac44f5cf98c63c5c90e81f24a5437/bad/mod_user.py#L16
https://github.com/SAST-org/SAST-Test-Repo-a8102749-e741-4799-bcc4-443eaad2f363/blob/427f62ad368ac44f5cf98c63c5c90e81f24a5437/bad/mod_user.py#L20
https://github.com/SAST-org/SAST-Test-Repo-a8102749-e741-4799-bcc4-443eaad2f363/blob/427f62ad368ac44f5cf98c63c5c90e81f24a5437/bad/libuser.py#L5
https://github.com/SAST-org/SAST-Test-Repo-a8102749-e741-4799-bcc4-443eaad2f363/blob/427f62ad368ac44f5cf98c63c5c90e81f24a5437/bad/libuser.py#L12
</details>
</details>
<details>
<summary>bad/libuser.py:12</summary>
https://github.com/SAST-org/SAST-Test-Repo-a8102749-e741-4799-bcc4-443eaad2f363/blob/427f62ad368ac44f5cf98c63c5c90e81f24a5437/bad/libuser.py#L7-L12
<details>
<summary> Trace </summary>
https://github.com/SAST-org/SAST-Test-Repo-a8102749-e741-4799-bcc4-443eaad2f363/blob/427f62ad368ac44f5cf98c63c5c90e81f24a5437/bad/mod_user.py#L16
https://github.com/SAST-org/SAST-Test-Repo-a8102749-e741-4799-bcc4-443eaad2f363/blob/427f62ad368ac44f5cf98c63c5c90e81f24a5437/bad/mod_user.py#L20
https://github.com/SAST-org/SAST-Test-Repo-a8102749-e741-4799-bcc4-443eaad2f363/blob/427f62ad368ac44f5cf98c63c5c90e81f24a5437/bad/libuser.py#L5
https://github.com/SAST-org/SAST-Test-Repo-a8102749-e741-4799-bcc4-443eaad2f363/blob/427f62ad368ac44f5cf98c63c5c90e81f24a5437/bad/libuser.py#L12
</details>
</details>
<details>
<summary>bad/libuser.py:12</summary>
https://github.com/SAST-org/SAST-Test-Repo-a8102749-e741-4799-bcc4-443eaad2f363/blob/427f62ad368ac44f5cf98c63c5c90e81f24a5437/bad/libuser.py#L7-L12
<details>
<summary> Trace </summary>
https://github.com/SAST-org/SAST-Test-Repo-a8102749-e741-4799-bcc4-443eaad2f363/blob/427f62ad368ac44f5cf98c63c5c90e81f24a5437/bad/mod_user.py#L17
https://github.com/SAST-org/SAST-Test-Repo-a8102749-e741-4799-bcc4-443eaad2f363/blob/427f62ad368ac44f5cf98c63c5c90e81f24a5437/bad/mod_user.py#L20
https://github.com/SAST-org/SAST-Test-Repo-a8102749-e741-4799-bcc4-443eaad2f363/blob/427f62ad368ac44f5cf98c63c5c90e81f24a5437/bad/libuser.py#L5
https://github.com/SAST-org/SAST-Test-Repo-a8102749-e741-4799-bcc4-443eaad2f363/blob/427f62ad368ac44f5cf98c63c5c90e81f24a5437/bad/libuser.py#L12
</details>
</details>
<details>
<summary>bad/libuser.py:12</summary>
https://github.com/SAST-org/SAST-Test-Repo-a8102749-e741-4799-bcc4-443eaad2f363/blob/427f62ad368ac44f5cf98c63c5c90e81f24a5437/bad/libuser.py#L7-L12
<details>
<summary> Trace </summary>
https://github.com/SAST-org/SAST-Test-Repo-a8102749-e741-4799-bcc4-443eaad2f363/blob/427f62ad368ac44f5cf98c63c5c90e81f24a5437/bad/mod_user.py#L17
https://github.com/SAST-org/SAST-Test-Repo-a8102749-e741-4799-bcc4-443eaad2f363/blob/427f62ad368ac44f5cf98c63c5c90e81f24a5437/bad/mod_user.py#L20
https://github.com/SAST-org/SAST-Test-Repo-a8102749-e741-4799-bcc4-443eaad2f363/blob/427f62ad368ac44f5cf98c63c5c90e81f24a5437/bad/libuser.py#L5
https://github.com/SAST-org/SAST-Test-Repo-a8102749-e741-4799-bcc4-443eaad2f363/blob/427f62ad368ac44f5cf98c63c5c90e81f24a5437/bad/libuser.py#L12
</details>
</details>
<details>
<summary>bad/libuser.py:12</summary>
https://github.com/SAST-org/SAST-Test-Repo-a8102749-e741-4799-bcc4-443eaad2f363/blob/427f62ad368ac44f5cf98c63c5c90e81f24a5437/bad/libuser.py#L7-L12
<details>
<summary> Trace </summary>
https://github.com/SAST-org/SAST-Test-Repo-a8102749-e741-4799-bcc4-443eaad2f363/blob/427f62ad368ac44f5cf98c63c5c90e81f24a5437/bad/mod_api.py#L32
https://github.com/SAST-org/SAST-Test-Repo-a8102749-e741-4799-bcc4-443eaad2f363/blob/427f62ad368ac44f5cf98c63c5c90e81f24a5437/bad/libapi.py#L8
https://github.com/SAST-org/SAST-Test-Repo-a8102749-e741-4799-bcc4-443eaad2f363/blob/427f62ad368ac44f5cf98c63c5c90e81f24a5437/bad/libuser.py#L5
https://github.com/SAST-org/SAST-Test-Repo-a8102749-e741-4799-bcc4-443eaad2f363/blob/427f62ad368ac44f5cf98c63c5c90e81f24a5437/bad/libuser.py#L12
</details>
</details>
</details>
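The CWE-89 findings above point at query-building code in `libuser.py`, which is not reproduced here. As a generic hedged sketch of the usual remediation — parameterized queries via `sqlite3`, with a hypothetical `get_user` helper standing in for the flagged functions:

```python
import sqlite3

def get_user(conn, username):
    # Bind the value as a parameter instead of formatting it into the
    # SQL string -- the concatenated form flagged by CWE-89 would look
    # like "SELECT ... WHERE name = '%s'" % username.
    cur = conn.execute("SELECT name FROM users WHERE name = ?", (username,))
    return cur.fetchone()

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT)")
conn.execute("INSERT INTO users VALUES ('alice')")
```

A classic injection payload such as `x' OR '1'='1` is then treated as a literal username and matches nothing.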
|
1.0
|
Code Security Report: 24 high severity findings, 27 total findings - # Code Security Report
<details>
<summary> Trace </summary>
https://github.com/SAST-org/SAST-Test-Repo-a8102749-e741-4799-bcc4-443eaad2f363/blob/427f62ad368ac44f5cf98c63c5c90e81f24a5437/bad/mod_user.py#L17
https://github.com/SAST-org/SAST-Test-Repo-a8102749-e741-4799-bcc4-443eaad2f363/blob/427f62ad368ac44f5cf98c63c5c90e81f24a5437/bad/mod_user.py#L20
https://github.com/SAST-org/SAST-Test-Repo-a8102749-e741-4799-bcc4-443eaad2f363/blob/427f62ad368ac44f5cf98c63c5c90e81f24a5437/bad/libuser.py#L5
https://github.com/SAST-org/SAST-Test-Repo-a8102749-e741-4799-bcc4-443eaad2f363/blob/427f62ad368ac44f5cf98c63c5c90e81f24a5437/bad/libuser.py#L12
</details>
</details>
<details>
<summary>bad/libuser.py:12</summary>
https://github.com/SAST-org/SAST-Test-Repo-a8102749-e741-4799-bcc4-443eaad2f363/blob/427f62ad368ac44f5cf98c63c5c90e81f24a5437/bad/libuser.py#L7-L12
<details>
<summary> Trace </summary>
https://github.com/SAST-org/SAST-Test-Repo-a8102749-e741-4799-bcc4-443eaad2f363/blob/427f62ad368ac44f5cf98c63c5c90e81f24a5437/bad/mod_user.py#L17
https://github.com/SAST-org/SAST-Test-Repo-a8102749-e741-4799-bcc4-443eaad2f363/blob/427f62ad368ac44f5cf98c63c5c90e81f24a5437/bad/mod_user.py#L20
https://github.com/SAST-org/SAST-Test-Repo-a8102749-e741-4799-bcc4-443eaad2f363/blob/427f62ad368ac44f5cf98c63c5c90e81f24a5437/bad/libuser.py#L5
https://github.com/SAST-org/SAST-Test-Repo-a8102749-e741-4799-bcc4-443eaad2f363/blob/427f62ad368ac44f5cf98c63c5c90e81f24a5437/bad/libuser.py#L12
</details>
</details>
<details>
<summary>bad/libuser.py:12</summary>
https://github.com/SAST-org/SAST-Test-Repo-a8102749-e741-4799-bcc4-443eaad2f363/blob/427f62ad368ac44f5cf98c63c5c90e81f24a5437/bad/libuser.py#L7-L12
<details>
<summary> Trace </summary>
https://github.com/SAST-org/SAST-Test-Repo-a8102749-e741-4799-bcc4-443eaad2f363/blob/427f62ad368ac44f5cf98c63c5c90e81f24a5437/bad/mod_api.py#L32
https://github.com/SAST-org/SAST-Test-Repo-a8102749-e741-4799-bcc4-443eaad2f363/blob/427f62ad368ac44f5cf98c63c5c90e81f24a5437/bad/libapi.py#L8
https://github.com/SAST-org/SAST-Test-Repo-a8102749-e741-4799-bcc4-443eaad2f363/blob/427f62ad368ac44f5cf98c63c5c90e81f24a5437/bad/libuser.py#L5
https://github.com/SAST-org/SAST-Test-Repo-a8102749-e741-4799-bcc4-443eaad2f363/blob/427f62ad368ac44f5cf98c63c5c90e81f24a5437/bad/libuser.py#L12
</details>
</details>
</details>
|
non_process
|
code security report high severity findings total findings code security report latest scan total findings tested project files detected programming languages check this box to manually trigger a scan language javascript node js no vulnerability findings detected language python severity cwe vulnerability type count high injection medium password credentials details the below list presents the most relevant findings that need your attention to view information on the remaining findings navigate to the sql injection cwe findings bad libuser py trace bad libuser py trace bad libuser py trace bad libuser py trace bad libuser py trace bad libuser py trace bad libuser py trace bad libuser py trace bad libuser py trace bad libuser py trace bad libuser py trace bad libuser py trace bad libuser py trace bad libuser py trace bad libuser py trace bad libuser py trace bad libuser py trace bad libuser py trace bad libuser py trace bad libuser py trace
| 0
|
218,147
| 7,330,614,761
|
IssuesEvent
|
2018-03-05 10:31:50
|
NCEAS/metacat
|
https://api.github.com/repos/NCEAS/metacat
|
closed
|
automatically insert pubDate on registry submission
|
Category: registry Component: Bugzilla-Id Priority: Normal Status: Resolved Tracker: Bug
|
---
Author Name: **Callie Bowdish** (Callie Bowdish)
Original Redmine Issue: 2211, https://projects.ecoinformatics.org/ecoinfo/issues/2211
Original Date: 2005-09-28
Original Assignee: Saurabh Garg
---
We need to automatically insert pubDate on registry submission. The 'pubDate'
field represents the date that the resource was published. The format should be
represented as: CCYY, which represents a 4 digit year, or as CCYY-MM-DD, which
denotes the full year, month, and day. Note that month and day are optional
components. Formats must conform to ISO 8601
Updated and new version of data packages that have been submitted to the
registry will have a new pubDate.
IRC conversation, " yeah, an updated document is really a new version with a new
date the (modified) resource was published <matt> so it should proabbly be
updated <matt> the original revision will still have the original pubDate
<matt> so nothing is lost"
Note: This will be used for citations and will be used with citation information
in stylesheets along with the lsid on the ESA stylesheet.
|
1.0
|
automatically insert pubDate on registry submission - ---
Author Name: **Callie Bowdish** (Callie Bowdish)
Original Redmine Issue: 2211, https://projects.ecoinformatics.org/ecoinfo/issues/2211
Original Date: 2005-09-28
Original Assignee: Saurabh Garg
---
We need to automatically insert pubDate on registry submission. The 'pubDate'
field represents the date that the resource was published. The format should be
represented as: CCYY, which represents a 4 digit year, or as CCYY-MM-DD, which
denotes the full year, month, and day. Note that month and day are optional
components. Formats must conform to ISO 8601
Updated and new version of data packages that have been submitted to the
registry will have a new pubDate.
IRC conversation, " yeah, an updated document is really a new version with a new
date the (modified) resource was published <matt> so it should proabbly be
updated <matt> the original revision will still have the original pubDate
<matt> so nothing is lost"
Note: This will be used for citations and will be used with citation information
in stylesheets along with the lsid on the ESA stylesheet.
|
non_process
|
automatically insert pubdate on registry submission author name callie bowdish callie bowdish original redmine issue original date original assignee saurabh garg we need to automatically insert pubdate on registry submission the pubdate field represents the date that the resource was published the format should be represented as ccyy which represents a digit year or as ccyy mm dd which denotes the full year month and day note that month and day are optional components formats must conform to iso updated and new version of data packages that have been submitted to the registry will have a new pubdate irq conversation yeah an updated document is really a new version with a new date the modified resource was published so it should proabbly be updated the original revision will still have the original pubdate so nothing is lost note this will be used for citations and will be used with citation information in stylesheets along with the lsid on the esa stylesheet
| 0
|
2,796
| 5,724,417,765
|
IssuesEvent
|
2017-04-20 14:30:02
|
openvstorage/framework
|
https://api.github.com/repos/openvstorage/framework
|
reopened
|
Add timeout parameter to the SSHClient
|
process_duplicate type_feature
|
Can you add a timeout parameter in SSHClient?
We use the SSHClient run command to verify some stuff for the external arakoons but if one node is down we receive a timeout error.
It takes 248 sec before the timeout error appears.
```
externalArakoons : Install config Arakoons ---------------------------- 248.39s
```
```
2017-04-19 16:50:14 95700 +0200 - ovs-node01-1604 - 710/140621697152768 - arakoon_client/pyrakoon - 0 - ERROR - timed out: Unable to connect to ('10.100.199.182', 26400)\r
Traceback (most recent call last):\r
File \"/opt/OpenvStorage/ovs/extensions/db/arakoon/pyrakoon/pyrakoon/compat.py\", line 1262, in connect\r
self._socket = socket.create_connection(self._address, self._timeout)\r
File \"/usr/lib/python2.7/socket.py\", line 575, in create_connection\r
raise err\r
timeout: timed out\r
2017-04-19 16:51:14 97000 +0200 - ovs-node01-1604 - 710/140621697152768 - arakoon_client/pyrakoon - 1 - ERROR - timed out: Unable to connect to ('10.100.199.182', 26400)\r
Traceback (most recent call last):\r
File \"/opt/OpenvStorage/ovs/extensions/db/arakoon/pyrakoon/pyrakoon/compat.py\", line 1262, in connect\r
self._socket = socket.create_connection(self._address, self._timeout)\r
File \"/usr/lib/python2.7/socket.py\", line 575, in create_connection\r
raise err\r
timeout: timed out\r
```
|
1.0
|
Add timeout parameter to the SSHClient - Can you add a timeout parameter in SSHClient?
We use the SSHClient run command to verify some stuff for the external arakoons but if one node is down we receive a timeout error.
It takes 248 sec before the timeout error appears.
```
externalArakoons : Install config Arakoons ---------------------------- 248.39s
```
```
2017-04-19 16:50:14 95700 +0200 - ovs-node01-1604 - 710/140621697152768 - arakoon_client/pyrakoon - 0 - ERROR - timed out: Unable to connect to ('10.100.199.182', 26400)\r
Traceback (most recent call last):\r
File \"/opt/OpenvStorage/ovs/extensions/db/arakoon/pyrakoon/pyrakoon/compat.py\", line 1262, in connect\r
self._socket = socket.create_connection(self._address, self._timeout)\r
File \"/usr/lib/python2.7/socket.py\", line 575, in create_connection\r
raise err\r
timeout: timed out\r
2017-04-19 16:51:14 97000 +0200 - ovs-node01-1604 - 710/140621697152768 - arakoon_client/pyrakoon - 1 - ERROR - timed out: Unable to connect to ('10.100.199.182', 26400)\r
Traceback (most recent call last):\r
File \"/opt/OpenvStorage/ovs/extensions/db/arakoon/pyrakoon/pyrakoon/compat.py\", line 1262, in connect\r
self._socket = socket.create_connection(self._address, self._timeout)\r
File \"/usr/lib/python2.7/socket.py\", line 575, in create_connection\r
raise err\r
timeout: timed out\r
```
|
process
|
add timeout parameter to the sshclient can you add a timeout parameter in sshclient we use the sshclient run command to verify some stuff for the external arakoons but if one node is down we receive a timeout error it takes sec before the timeout error appears externalarakoons install config arakoons ovs arakoon client pyrakoon error timed out unable to connect to r traceback most recent call last r file opt openvstorage ovs extensions db arakoon pyrakoon pyrakoon compat py line in connect r self socket socket create connection self address self timeout r file usr lib socket py line in create connection r raise err r timeout timed out r ovs arakoon client pyrakoon error timed out unable to connect to r traceback most recent call last r file opt openvstorage ovs extensions db arakoon pyrakoon pyrakoon compat py line in connect r self socket socket create connection self address self timeout r file usr lib socket py line in create connection r raise err r timeout timed out r
| 1
|
82,080
| 3,602,667,322
|
IssuesEvent
|
2016-02-03 16:23:02
|
rh-lab-q/rpg
|
https://api.github.com/repos/rh-lab-q/rpg
|
opened
|
c plugin (and any other compiled lang) plugin should uncheck BuildArch: noarch option
|
low_priority
|
It should be checked by default, and if any plugin finds a compilable file then it marks it in the Spec object.
|
1.0
|
c plugin (and any other compiled lang) plugin should uncheck BuildArch: noarch option - It should be checked by default, and if any plugin finds a compilable file then it marks it in the Spec object.
|
non_process
|
c plugin and any other compiled lang plugin should uncheck buildarch noarch option it should be checked by default and if any plugin find compilable file then it mark it it in spec object
| 0
|
14,068
| 16,890,535,793
|
IssuesEvent
|
2021-06-23 08:43:13
|
arcus-azure/arcus.messaging
|
https://api.github.com/repos/arcus-azure/arcus.messaging
|
closed
|
Provide more flexible message parsing in message pump
|
area:message-processing feature
|
**Is your feature request related to a problem? Please describe.**
When adding new properties to the messaging model we do not always remember to update message consumers' projects
Because of that, the messages are no longer being processed by the defined message handler because it shows the following verbose level log:
```
Message handler 'X' is not able to process the message because the incoming message cannot be deserialized to the message that the message handler can handle
```
As a result, the message is moved on and handled by the fallback message handler.
**Describe the solution you'd like**
To avoid causing disrupting behavior it would be nice to have a flag that toggles LAX parsing, for example in the extension method: **AddServiceBusQueueMessagePump** adding a parameter i.e.: laxParsingEnabled = false so that existing implementations are not directly impacted but that we could easily switch on.
It would also be nice to see the above message being presented as an Error in the logs, making it easier to intercept it without having to toggle Logging levels.
**Additional context**
Our message contract:
```json
{
"Imei": "10.34.0.16:4096",
"FirmwareVersion": "efefad25-0d9f-4745-88d9-99b040b80c54",
"DeviceId": "352554100004969",
"Ip": "3.32.16.5",
"MessageReceivedDateTime": "2021-06-15T14:02:35.190Z"
}
```
Received message payload:
```diff
{
"Imei": "10.34.0.16:4096",
"FirmwareVersion": "efefad25-0d9f-4745-88d9-99b040b80c54",
"DeviceId": "352554100004969",
"Ip": "3.32.16.5",
+ "Status": null,
"MessageReceivedDateTime": "2021-06-15T14:02:35.190Z"
}
```
|
1.0
|
Provide more flexible message parsing in message pump - **Is your feature request related to a problem? Please describe.**
When adding new properties to the messaging model we do not always remember to update message consumers' projects
Because of that, the messages are no longer being processed by the defined message handler because it shows the following verbose level log:
```
Message handler 'X' is not able to process the message because the incoming message cannot be deserialized to the message that the message handler can handle
```
As a result, the message is moved on and handled by the fallback message handler.
**Describe the solution you'd like**
To avoid causing disrupting behavior it would be nice to have a flag that toggles LAX parsing, for example in the extension method: **AddServiceBusQueueMessagePump** adding a parameter i.e.: laxParsingEnabled = false so that existing implementations are not directly impacted but that we could easily switch on.
It would also be nice to see the above message being presented as an Error in the logs, making it easier to intercept it without having to toggle Logging levels.
**Additional context**
Our message contract:
```json
{
"Imei": "10.34.0.16:4096",
"FirmwareVersion": "efefad25-0d9f-4745-88d9-99b040b80c54",
"DeviceId": "352554100004969",
"Ip": "3.32.16.5",
"MessageReceivedDateTime": "2021-06-15T14:02:35.190Z"
}
```
Received message payload:
```diff
{
"Imei": "10.34.0.16:4096",
"FirmwareVersion": "efefad25-0d9f-4745-88d9-99b040b80c54",
"DeviceId": "352554100004969",
"Ip": "3.32.16.5",
+ "Status": null,
"MessageReceivedDateTime": "2021-06-15T14:02:35.190Z"
}
```
|
process
|
provide more flexible message parsing in message pump is your feature request related to a problem please describe when adding new properties to the messaging model we do not always remember to update message consumers projects because of that the messages are no longer being processed by the defined message handler because it shows the following verbose level log message handler x is not able to process the message because the incoming message cannot be deserialized to the message that the message handler can handle as a result the message is moved on and handled by the fallback message handler describe the solution you d like to avoid causing disrupting behavior it would be nice to have a flag that toggles lax parsing for example in the extension method addservicebusqueuemessagepump adding a parameter i e laxparsingenabled false so that existing implementations are not directly impacted but that we could easily switch on it would also be nice to see the above message being presented as an error in the logs making it easier to intercept it without having to toggle logging levels additional context our message contract json imei firmwareversion deviceid ip messagereceiveddatetime received message payload diff imei firmwareversion deviceid ip status null messagereceiveddatetime
| 1
|
156,950
| 24,627,563,716
|
IssuesEvent
|
2022-10-16 18:09:35
|
dotnet/efcore
|
https://api.github.com/repos/dotnet/efcore
|
closed
|
Backing field is null when lazy loading hasmany collection
|
closed-by-design customer-reported
|
Noticed that backing field collection is null if the property is configured to be lazy loaded. Once you access the property then the backing field gets loaded.
```C#
public class Blog
{
protected Blog() { }
public int BlogId { get; private set; }
public string BlogName { get; private set; }
private IList<Post> _posts;
public virtual IEnumerable<Post> Posts => _posts;
public void DeletePost(int postId)
{
        _posts.Remove(_posts.FirstOrDefault(p => p.PostId == postId));
}
}
public class Post
{
protected Post() { }
public int PostId { get; private set; }
public string PostTitle {get; private set; }
public int BlogId { get; private set; }
}
public class BlogMap : IEntityTypeConfiguration<Blog>
{
public void Configure(EntityTypeBuilder<Blog> builder)
{
builder.HasKey(blog => blog.BlogId);
builder.HasMany(blog => blog.Posts).WithOne().HasForeignKey(pm => pm.BlogId)
.Metadata.PrincipalToDependent.SetPropertyAccessMode(PropertyAccessMode.Field);
}
}
```
In the above example, accessing _posts in the DeletePost method returns null. If I hit a breakpoint before accessing _posts and try to look up Posts (lazy loading), then _posts also gets filled in.
|
1.0
|
Backing field is null when lazy loading hasmany collection - Noticed that backing field collection is null if the property is configured to be lazy loaded. Once you access the property then the backing field gets loaded.
```C#
public class Blog
{
protected Blog() { }
public int BlogId { get; private set; }
public string BlogName { get; private set; }
private IList<Post> _posts;
public virtual IEnumerable<Post> Posts => _posts;
public void DeletePost(int postId)
{
        _posts.Remove(_posts.FirstOrDefault(p => p.PostId == postId));
}
}
public class Post
{
protected Post() { }
public int PostId { get; private set; }
public string PostTitle {get; private set; }
public int BlogId { get; private set; }
}
public class BlogMap : IEntityTypeConfiguration<Blog>
{
public void Configure(EntityTypeBuilder<Blog> builder)
{
builder.HasKey(blog => blog.BlogId);
builder.HasMany(blog => blog.Posts).WithOne().HasForeignKey(pm => pm.BlogId)
.Metadata.PrincipalToDependent.SetPropertyAccessMode(PropertyAccessMode.Field);
}
}
```
In the above example, accessing _posts in the DeletePost method returns null. If I hit a breakpoint before accessing _posts and try to look up Posts (lazy loading), then _posts also gets filled in.
|
non_process
|
backing field is null when lazy loading hasmany collection noticed that backing field collection is null if the property is configured to be lazy loaded once you access the property then the backing field gets loaded c public class blog protected blog public int blogid get private set public string blogname get private set private ilist posts public virtual ienumerable posts posts public void deletepost int postid posts remove posts firstordefault postid p postid postid public class post protected post public int postid get private set public string posttitle get private set public int blogid get private set public class blogmap ientitytypeconfiguration public void configure entitytypebuilder builder builder haskey blog blog blogid builder hasmany blog blog posts withone hasforeignkey pm pm blogid metadata principaltodependent setpropertyaccessmode propertyaccessmode field in above example accessing posts in deletepost method returns null if i hit a breakpoint before accesing posts and tries to lookup posts lazy loading then posts also gets filled in
| 0
|
17,277
| 23,067,845,172
|
IssuesEvent
|
2022-07-25 15:17:29
|
open-telemetry/opentelemetry-collector-contrib
|
https://api.github.com/repos/open-telemetry/opentelemetry-collector-contrib
|
closed
|
ec2 resourcedetectionprocessor: Support configuring the http client used
|
processor/resourcedetection
|
**Is your feature request related to a problem? Please describe.**
The resource detection processors don't support configuring the http client they use while other exporters and receivers do. This is an issue when you want to configure settings like the http client timeout for processors.
**Describe the solution you'd like**
In the long run, we should make all the resource detection processor http clients configurable by the user. We should have the http clients switch to using the [opentelemetry HTTPClientSettings](https://github.com/open-telemetry/opentelemetry-collector/blob/main/config/confighttp/confighttp.go#L36 ) and support the user defining these HTTPClientSettings via a config file.
Here is an example of how this is done in the rabbitmq receiver.
https://github.com/open-telemetry/opentelemetry-collector-contrib/blob/595b1dd55784e3ef177ab4af748258b88d5fde19/receiver/rabbitmqreceiver/config.go#L42
https://github.com/open-telemetry/opentelemetry-collector-contrib/blob/595b1dd55784e3ef177ab4af748258b88d5fde19/receiver/rabbitmqreceiver/client.go#L53
For the scope of this ticket, I suggest only making the ec2 resource detection processor http configurable and file new tickets for the others processors that need to be updated.
**Describe alternatives you've considered**
I haven't come up with any other ways.
**Additional context**
- Some things to consider. The ecs resource detector already uses some AWS utils to setup http clients, perhaps AWS code owners would want to refactor these utils to work with the ec2 detector. This may or may not be worth doing.
- I have a customer at my company needing/requesting this for a support case.
- I (jvoravong) am available to work on this ticket.
|
1.0
|
ec2 resourcedetectionprocessor: Support configuring the http client used - **Is your feature request related to a problem? Please describe.**
The resource detection processors don't support configuring the http client they use while other exporters and receivers do. This is an issue when you want to configure settings like the http client timeout for processors.
**Describe the solution you'd like**
In the long run, we should make all the resource detection processor http clients configurable by the user. We should have the http clients switch to using the [opentelemetry HTTPClientSettings](https://github.com/open-telemetry/opentelemetry-collector/blob/main/config/confighttp/confighttp.go#L36 ) and support the user defining these HTTPClientSettings via a config file.
Here is an example of how this is done in the rabbitmq receiver.
https://github.com/open-telemetry/opentelemetry-collector-contrib/blob/595b1dd55784e3ef177ab4af748258b88d5fde19/receiver/rabbitmqreceiver/config.go#L42
https://github.com/open-telemetry/opentelemetry-collector-contrib/blob/595b1dd55784e3ef177ab4af748258b88d5fde19/receiver/rabbitmqreceiver/client.go#L53
For the scope of this ticket, I suggest only making the ec2 resource detection processor http configurable and file new tickets for the others processors that need to be updated.
**Describe alternatives you've considered**
I haven't come up with any other ways.
**Additional context**
- Some things to consider. The ecs resource detector already uses some AWS utils to setup http clients, perhaps AWS code owners would want to refactor these utils to work with the ec2 detector. This may or may not be worth doing.
- I have a customer at my company needing/requesting this for a support case.
- I (jvoravong) am available to work on this ticket.
|
process
|
resourcedetectionprocessor support configuring the http client used is your feature request related to a problem please describe the resource detection processors don t support configuring the http client they use while other exporters and receivers do this is an issue when you want to configure settings like the http client timeout for processors describe the solution you d like in the long run we should make all the resource detection processor http clients configurable by the user we should have the http clients switch to using the and support the user defining these httpclientsettings via a config file here is an example of how this is done in the rabbitmq receiver for the scope of this ticket i suggest only making the resource detection processor http configurable and file new tickets for the others processors that need to be updated describe alternatives you ve considered i haven t come up with any other ways additional context some things to consider the ecs resource detector already uses some aws utils to setup http clients perhaps aws code owners would want to refactor these utils to work with the detector this may or may not be worth doing i have a customer at my company needing requesting this for a support case i jvoravong am available to work on this ticket
| 1
|
17,870
| 23,814,254,681
|
IssuesEvent
|
2022-09-05 04:01:27
|
MicrosoftDocs/azure-docs
|
https://api.github.com/repos/MicrosoftDocs/azure-docs
|
closed
|
A validity time period is not correct at "Update a webhook"
|
automation/svc triaged cxp doc-enhancement process-automation/subsvc Pri2
|
It is "1 year" not "10 years" at section "Update a webhook". Could you fix it?
_When a webhook is created, it has a validity time period of 10 years, after which it automatically expires_
https://docs.microsoft.com/en-us/azure/automation/automation-webhooks?tabs=portal#update-a-webhook
---
#### Document Details
⚠ *Do not edit this section. It is required for docs.microsoft.com ➟ GitHub issue linking.*
* ID: 7a6394c7-9bef-b8f8-ffd6-9d9d8e2daa07
* Version Independent ID: 5ffa20a2-436c-2726-dc57-9d3b49f9ca39
* Content: [Start an Azure Automation runbook from a webhook](https://docs.microsoft.com/en-us/azure/automation/automation-webhooks?tabs=portal#update-a-webhook)
* Content Source: [articles/automation/automation-webhooks.md](https://github.com/MicrosoftDocs/azure-docs/blob/main/articles/automation/automation-webhooks.md)
* Service: **automation**
* Sub-service: **process-automation**
* GitHub Login: @SnehaSudhirG
* Microsoft Alias: **sudhirsneha**
|
1.0
|
A validity time period is not correct at "Update a webhook" - It is "1 year" not "10 years" at section "Update a webhook". Could you fix it?
_When a webhook is created, it has a validity time period of 10 years, after which it automatically expires_
https://docs.microsoft.com/en-us/azure/automation/automation-webhooks?tabs=portal#update-a-webhook
---
#### Document Details
⚠ *Do not edit this section. It is required for docs.microsoft.com ➟ GitHub issue linking.*
* ID: 7a6394c7-9bef-b8f8-ffd6-9d9d8e2daa07
* Version Independent ID: 5ffa20a2-436c-2726-dc57-9d3b49f9ca39
* Content: [Start an Azure Automation runbook from a webhook](https://docs.microsoft.com/en-us/azure/automation/automation-webhooks?tabs=portal#update-a-webhook)
* Content Source: [articles/automation/automation-webhooks.md](https://github.com/MicrosoftDocs/azure-docs/blob/main/articles/automation/automation-webhooks.md)
* Service: **automation**
* Sub-service: **process-automation**
* GitHub Login: @SnehaSudhirG
* Microsoft Alias: **sudhirsneha**
|
process
|
a validity time period is not correct at update a webhook it is year not years at section update a webhook could you fix it when a webhook is created it has a validity time period of years after which it automatically expires document details ⚠ do not edit this section it is required for docs microsoft com ➟ github issue linking id version independent id content content source service automation sub service process automation github login snehasudhirg microsoft alias sudhirsneha
| 1
|
48,654
| 12,227,178,887
|
IssuesEvent
|
2020-05-03 14:14:30
|
stitchEm/stitchEm
|
https://api.github.com/repos/stitchEm/stitchEm
|
closed
|
Jetson Nano cmake Error
|
Build
|
I'm attempting to get StitchEm Studio running on a Nvidia Jetson Nano. I'm running into it complaining that several variables are used in the project but they are not set:
CMake Error: The following variables are used in this project, but they are set to NOTFOUND.
Please set them or make sure they are set and tested correctly in the CMake files:
FFMPEG_avcodec
linked by target "avPlugin" in directory /home/aspen/Documents/Development/stitchEm/IO/src/av
FFMPEG_avformat
linked by target "avPlugin" in directory /home/aspen/Documents/Development/stitchEm/IO/src/av
FFMPEG_avutil
linked by target "avPlugin" in directory /home/aspen/Documents/Development/stitchEm/IO/src/av
TIFF
linked by target "tiffPlugin" in directory /home/aspen/Documents/Development/stitchEm/IO/src/tiff
I've tried a couple different cmake with -DISABLE_AV=ON and with -DISABLE_TIFF=ON
cmake -DGPU_BACKEND_CUDA=ON -DGPU_BACKEND_OPENCL=OFF -DISABLE_AV=ON -DDISABLE_RTMP=ON DISABLE_TIFF=ON -G Ninja stitchEm
cmake -DGPU_BACKEND_CUDA=ON -DGPU_BACKEND_OPENCL=OFF -DDISABLE_RTMP=ON -G Ninja stitchEm
Any suggestions on how to get past this?
[CMakeError.log](https://github.com/stitchEm/stitchEm/files/4252419/CMakeError.log)
|
1.0
|
Jetson Nano cmake Error - I'm attempting to get StitchEm Studio running on a Nvidia Jetson Nano. I'm running into it complaining that several variables are used in the project but they are not set:
CMake Error: The following variables are used in this project, but they are set to NOTFOUND.
Please set them or make sure they are set and tested correctly in the CMake files:
FFMPEG_avcodec
linked by target "avPlugin" in directory /home/aspen/Documents/Development/stitchEm/IO/src/av
FFMPEG_avformat
linked by target "avPlugin" in directory /home/aspen/Documents/Development/stitchEm/IO/src/av
FFMPEG_avutil
linked by target "avPlugin" in directory /home/aspen/Documents/Development/stitchEm/IO/src/av
TIFF
linked by target "tiffPlugin" in directory /home/aspen/Documents/Development/stitchEm/IO/src/tiff
I've tried a couple of different cmake invocations, with -DISABLE_AV=ON and with -DISABLE_TIFF=ON
cmake -DGPU_BACKEND_CUDA=ON -DGPU_BACKEND_OPENCL=OFF -DISABLE_AV=ON -DDISABLE_RTMP=ON DISABLE_TIFF=ON -G Ninja stitchEm
cmake -DGPU_BACKEND_CUDA=ON -DGPU_BACKEND_OPENCL=OFF -DDISABLE_RTMP=ON -G Ninja stitchEm
Any suggestions on how to get past this?
[CMakeError.log](https://github.com/stitchEm/stitchEm/files/4252419/CMakeError.log)
|
non_process
|
jetson nano cmake error i m attempting to get stitchem studio running on a nvidia jetson nano i m running into it complaining that several variables are used in the project but they are not set cmake error the following variables are used in this project but they are set to notfound please set them or make sure they are set and tested correctly in the cmake files ffmpeg avcodec linked by target avplugin in directory home aspen documents development stitchem io src av ffmpeg avformat linked by target avplugin in directory home aspen documents development stitchem io src av ffmpeg avutil linked by target avplugin in directory home aspen documents development stitchem io src av tiff linked by target tiffplugin in directory home aspen documents development stitchem io src tiff i ve tried a couple different cmake with disable av on and with disable tiff on cmake dgpu backend cuda on dgpu backend opencl off disable av on ddisable rtmp on disable tiff on g ninja stitchem cmake dgpu backend cuda on dgpu backend opencl off ddisable rtmp on g ninja stitchem any suggestions on how to get past this
| 0
|
4,062
| 6,994,135,421
|
IssuesEvent
|
2017-12-15 14:18:57
|
symfony/symfony
|
https://api.github.com/repos/symfony/symfony
|
closed
|
[Process] ContextErrorException | Array to string conversion
|
Bug Process Status: Needs Review Status: Waiting feedback
|
| Q | A
| ---------------- | -----
| Bug report? | yes
| Feature request? | no
| BC Break report? | yes
| RFC? | no
| Symfony version | 3.4.2
Environment variables passed to `proc_open()` in https://github.com/symfony/symfony/blob/v3.4.2/src/Symfony/Component/Process/Process.php#L334 throw a `ContextErrorException "Array to string conversion"`
The reason seems to be that env vars are now always inherited https://github.com/symfony/symfony/compare/v3.4.1...v3.4.2#diff-f9f2411040cda7b73402481facf3e4dd
but if a query string is present in the url then $env contains `argv` as an array
```php
[
"CONTENT_LENGTH" => "8610"
"CONTENT_TYPE" => "application/x-www-form-urlencoded"
"REQUEST_METHOD" => "POST"
"QUERY_STRING" => "uniqid=s5a339da72130a"
"argv" => [
0 => "uniqid=s5a339da72130a"
]
]
```
and apparently `proc_open()` does not accept a multidimensional array for this parameter.
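As an illustration of the workaround shape (a hypothetical sketch, not Symfony code — `flatten_env` is an assumed helper name), `proc_open()`-style APIs expect a flat map of string environment variables, so filtering out non-scalar entries such as `argv` before spawning avoids the "Array to string conversion" failure:

```python
def flatten_env(env):
    """Keep only entries whose values can be coerced to plain strings.

    Hypothetical helper for illustration: drops nested arrays such as
    'argv' that a proc_open()-style API cannot stringify.
    """
    return {k: str(v) for k, v in env.items()
            if isinstance(v, (str, int, float))}

# Environment shaped like the one in the report above.
raw_env = {
    "CONTENT_LENGTH": "8610",
    "REQUEST_METHOD": "POST",
    "QUERY_STRING": "uniqid=s5a339da72130a",
    "argv": ["uniqid=s5a339da72130a"],  # nested array that breaks the spawn
}

safe_env = flatten_env(raw_env)
```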
|
1.0
|
[Process] ContextErrorException | Array to string conversion - | Q | A
| ---------------- | -----
| Bug report? | yes
| Feature request? | no
| BC Break report? | yes
| RFC? | no
| Symfony version | 3.4.2
Environment variables passed to `proc_open()` in https://github.com/symfony/symfony/blob/v3.4.2/src/Symfony/Component/Process/Process.php#L334 throw a `ContextErrorException "Array to string conversion"`
The reason seems to be that env vars are now always inherited https://github.com/symfony/symfony/compare/v3.4.1...v3.4.2#diff-f9f2411040cda7b73402481facf3e4dd
but if a query string is present in the url then $env contains `argv` as an array
```php
[
"CONTENT_LENGTH" => "8610"
"CONTENT_TYPE" => "application/x-www-form-urlencoded"
"REQUEST_METHOD" => "POST"
"QUERY_STRING" => "uniqid=s5a339da72130a"
"argv" => [
0 => "uniqid=s5a339da72130a"
]
]
```
and apparently `proc_open()` does not accept a multidimensional array for this parameter.
|
process
|
contexterrorexception array to string conversion q a bug report yes feature request no bc break report yes rfc no symfony version environnement variables passed to proc open in throw a contexterrorexception array to string conversion the reason seems to be that env vars are now always inherited but if a query string is present in the url then env contains argv as an array php content length content type application x www form urlencoded request method post query string uniqid argv uniqid and apparently proc open does not expect a multi dimensionnal array for this parameter
| 1
|
3,274
| 6,359,073,793
|
IssuesEvent
|
2017-07-31 05:41:04
|
dzhw/zofar
|
https://api.github.com/repos/dzhw/zofar
|
closed
|
create Wiki
|
category: service.processes prio: 1 status: testing type: backlog.task
|
Create the Wiki @ repository
- [x] Def. of Done
- [x] Sprint History
- [x] Refinement
|
1.0
|
create Wiki - Create the Wiki @ repository
- [x] Def. of Done
- [x] Sprint History
- [x] Refinement
|
process
|
create wiki create the wiki repository def of done sprint history refinement
| 1
|
284,575
| 8,743,983,870
|
IssuesEvent
|
2018-12-12 20:49:26
|
spacetelescope/jwql
|
https://api.github.com/repos/spacetelescope/jwql
|
closed
|
Provide some RESTful services/API in web app
|
Medium Priority Web Application enhancement
|
It would be useful to build in a RESTful API into our web application, so that users can easily get data programatically. It could even be useful to us developers when we are in need of static data, and could cut down on the amount of code that we need to write in `views.py` and `data_containers.py`.
#### A simple example: Get the list of proposals for a given instrument:
in `urls.py`:
```python
urlpatterns = [
...
path('api/<str:inst>/archive/proposals', views.proposals, name='proposals')
]
```
in `views.py`:
```python
from django.http import JsonResponse
def proposals(request, inst):
"""Return a list of proposals for the given instrument"""
return JsonResponse({'proposal_list': [10000, 100001, 100002]})
```
What is returned when visiting `http://127.0.0.1:8000/jwql/api/FGS/archive/proposals`:
```
{"proposal_list": [10000, 100001, 100002]}
```
To get the data programmatically:
```python
import json
import urllib.request
url = urllib.request.urlopen('http://127.0.0.1:8000/jwql/api/FGS/archive/proposals')
data = json.loads(url.read().decode())
print(data)
# {'proposal_list': [10000, 100001, 100002]}
```
|
1.0
|
Provide some RESTful services/API in web app - It would be useful to build in a RESTful API into our web application, so that users can easily get data programatically. It could even be useful to us developers when we are in need of static data, and could cut down on the amount of code that we need to write in `views.py` and `data_containers.py`.
#### A simple example: Get the list of proposals for a given instrument:
in `urls.py`:
```python
urlpatterns = [
...
path('api/<str:inst>/archive/proposals', views.proposals, name='proposals')
]
```
in `views.py`:
```python
from django.http import JsonResponse
def proposals(request, inst):
"""Return a list of proposals for the given instrument"""
return JsonResponse({'proposal_list': [10000, 100001, 100002]})
```
What is returned when visiting `http://127.0.0.1:8000/jwql/api/FGS/archive/proposals`:
```
{"proposal_list": [10000, 100001, 100002]}
```
To get the data programmatically:
```python
import json
import urllib.request
url = urllib.request.urlopen('http://127.0.0.1:8000/jwql/api/FGS/archive/proposals')
data = json.loads(url.read().decode())
print(data)
# {'proposal_list': [10000, 100001, 100002]}
```
|
non_process
|
provide some restful services api in web app it would be useful to build in a restful api into our web application so that users can easily get data programatically it could even be useful to us developers when we are in need of static data and could cut down on the amount of code that we need to write in views py and data containers py a simple example get the list of proposals for a given instrument in urls py python urlpatterns path api archive proposals views proposals name proposals in views py python from django http import jsonresponse def proposals request inst return a list of proposals for the given instrument return jsonresponse proposal list what is returned when visiting proposal list to get the data programmatically python import json import urllib request url urllib request urlopen data json loads url read decode print data proposal list
| 0
|
260,150
| 19,659,495,528
|
IssuesEvent
|
2022-01-10 15:41:19
|
Nautilus-Cyberneering/nautilus-librarian
|
https://api.github.com/repos/Nautilus-Cyberneering/nautilus-librarian
|
closed
|
Improve docs about running commands locally
|
documentation
|
For some complex commands like the "Gold Image Processing", it could be convenient to explain how you can run tests locally. We have an explanation on the docs:
https://nautilus-cyberneering.github.io/nautilus-librarian/development/#run-workflows-locally
but it's deprecated and, besides, it only explains how to run GitHub workflows. You can also execute the librarian command directly in a directory where you have previously set up the initial git/dvc repo in the state you want to test. I have explained that process here:
https://github.com/Nautilus-Cyberneering/nautilus-librarian/pull/56#issuecomment-1002991680
and I think we should add it to the documentation because it can be very useful for testing purposes. The key point is you can run the nautilus command using a different working directory, that way you can use your latest dev branch with a different git repo dir.
|
1.0
|
Improve docs about running commands locally - For some complex commands like the "Gold Image Processing", it could be convenient to explain how you can run tests locally. We have an explanation on the docs:
https://nautilus-cyberneering.github.io/nautilus-librarian/development/#run-workflows-locally
but it's deprecated and, besides, it only explains how to run GitHub workflows. You can also execute the librarian command directly in a directory where you have previously set up the initial git/dvc repo in the state you want to test. I have explained that process here:
https://github.com/Nautilus-Cyberneering/nautilus-librarian/pull/56#issuecomment-1002991680
and I think we should add it to the documentation because it can be very useful for testing purposes. The key point is you can run the nautilus command using a different working directory, that way you can use your latest dev branch with a different git repo dir.
|
non_process
|
improve docs about running commands locally for some complex commands like the gold image processing it could be convenient to explain how you can run tests locally we have an explanation on the docs but it s deprecated and besides it only explains how to run github workflows you can also execute directly the librarian command in a directory where you have previously set up the initial git dvc repo at the state you want to test i have explained that process here and i think we should add it to the documentation because it can be very useful for testing purposes the key point is you can run the nautilus command using a different working directory that way you can use your latest dev branch with a different git repo dir
| 0
|
20,549
| 27,210,425,419
|
IssuesEvent
|
2023-02-20 16:07:00
|
cse442-at-ub/project_s23-iweatherify
|
https://api.github.com/repos/cse442-at-ub/project_s23-iweatherify
|
closed
|
Creating PHP Documentation
|
Processing Task Sprint 1
|
**Task Tests**
*Test 1*
1) Open the Google Document called “PHP Documentation”
https://docs.google.com/document/d/1f2W6JGzp1QbLZvDrYjKrBlevV2UbyYYYefAmMZBztbA/edit
2) Read the “Getting Started with PHP” section
3) Read the “Recommended PHP Resources” section
4) Read the “Connecting to UB Servers” section
5) Complete the tutorial
|
1.0
|
Creating PHP Documentation - **Task Tests**
*Test 1*
1) Open the Google Document called “PHP Documentation”
https://docs.google.com/document/d/1f2W6JGzp1QbLZvDrYjKrBlevV2UbyYYYefAmMZBztbA/edit
2) Read the “Getting Started with PHP” section
3) Read the “Recommended PHP Resources” section
4) Read the “Connecting to UB Servers” section
5) Complete the tutorial
|
process
|
creating php documentation task tests test open the google document called “php documentation” read the “getting started with php” section read the “recommended php resources” section read the “connecting to ub servers” section complete the tutorial
| 1
|
77,564
| 14,883,743,415
|
IssuesEvent
|
2021-01-20 13:44:38
|
JosefPihrt/Roslynator
|
https://api.github.com/repos/JosefPihrt/Roslynator
|
closed
|
[Question] Enable refactorings in vscode (RRxxxx)
|
VS Code
|
**Product and Version Used**:
IDE: vscode latest
config: `.editorconfig` (not rulesets)
**Steps to Reproduce**:
I add the three [nuget packages](https://github.com/JosefPihrt/Roslynator#nuget-analyzers) to the project.
The RCS analyzers work automatically.
How do I enable the refactorings (RRxxxx)?
e.g. I tried `dotnet_diagnostic.RR0190.severity = warning` But that doesn't enable it.
|
1.0
|
[Question] Enable refactorings in vscode (RRxxxx) - **Product and Version Used**:
IDE: vscode latest
config: `.editorconfig` (not rulesets)
**Steps to Reproduce**:
I add the three [nuget packages](https://github.com/JosefPihrt/Roslynator#nuget-analyzers) to the project.
The RCS analyzers work automatically.
How do I enable the refactorings (RRxxxx)?
e.g. I tried `dotnet_diagnostic.RR0190.severity = warning` But that doesn't enable it.
|
non_process
|
enable refactorings in vscode rrxxxx product and version used ide vscode latest config editorconfig not rulesets steps to reproduce i add the three to the project the rcs analyzers work automatically how do i enable the refactorings rrxxxx e g i tried dotnet diagnostic severity warning but that doesn t enable it
| 0
|
794,228
| 28,027,015,132
|
IssuesEvent
|
2023-03-28 09:47:36
|
kubernetes/ingress-nginx
|
https://api.github.com/repos/kubernetes/ingress-nginx
|
closed
|
controller goes into CrashLoopBackOff if controller.publishService.pathOverride is invalid
|
needs-kind needs-triage needs-priority
|
**What happened**:
Deploying the controller with an invalid value for controller.publishService.pathOverride results in the controller exiting immediately after starting. This happened due to a typo in the configuration.
Similar reports in https://github.com/kubernetes/ingress-nginx/issues/7770.
Logs:
```
I0327 08:11:21.736159 7 main.go:253] "Running in Kubernetes cluster" major="1" minor="25+" git="v1.25.6-eks-48e63af" state="clean" commit="9f22d4ae876173884749c0701f01340879ab3f95" platform="linux/amd64"
F0327 08:11:21.742783 7 main.go:92] No service with name test found in namespace nginx: services "test" not found
```
**What you expected to happen**:
Not sure if the controller should start, log the exception, and serve traffic as expected. If the controller should not run with an invalid pathOverride configuration, then this issue can be closed as "not a bug".
**NGINX Ingress controller version** (exec into the pod and run nginx-ingress-controller --version.):
```
registry.k8s.io/ingress-nginx/controller:v1.6.4@sha256:15be4666c53052484dd2992efacf2f50ea77a78ae8aa21ccd91af6baaa7ea22f
```
**Kubernetes version** (use `kubectl version`):
```
v1.25.6-eks-48e63af
```
**Environment**:
- **How was the ingress-nginx-controller installed**:
- Kustomize
```
helmCharts:
- name: ingress-nginx
repo: https://kubernetes.github.io/ingress-nginx
version: 4.5.2
valuesFile: nginx.yaml
releaseName: alpha-public
```
nginx.yaml:
```
controller:
publishService:
enabled: true
pathOverride: nginx/test
service:
type: ClusterIP
enableHttp: false
```
- **Current State of the controller**:
- `kubectl describe ingressclasses`
- `kubectl -n <ingresscontrollernamespace> get all -A -o wide`
- `kubectl -n <ingresscontrollernamespace> describe po <ingresscontrollerpodname>`
- `kubectl -n <ingresscontrollernamespace> describe svc <ingresscontrollerservicename>`
- **Current state of ingress object, if applicable**:
- `kubectl -n <appnnamespace> get all,ing -o wide`
- `kubectl -n <appnamespace> describe ing <ingressname>`
- If applicable, then, your complete and exact curl/grpcurl command (redacted if required) and the reponse to the curl/grpcurl command with the -v flag
- **Others**:
- Any other related information like ;
- copy/paste of the snippet (if applicable)
- `kubectl describe ...` of any custom configmap(s) created and in use
- Any other related information that may help
**How to reproduce this issue**:
<!---
As minimally and precisely as possible. Keep in mind we do not have access to your cluster or application.
Help up us (if possible) reproducing the issue using minikube or kind.
## Install minikube/kind
- Minikube https://minikube.sigs.k8s.io/docs/start/
- Kind https://kind.sigs.k8s.io/docs/user/quick-start/
## Install the ingress controller
kubectl apply -f https://raw.githubusercontent.com/kubernetes/ingress-nginx/main/deploy/static/provider/baremetal/deploy.yaml
## Install an application that will act as default backend (is just an echo app)
kubectl apply -f https://raw.githubusercontent.com/kubernetes/ingress-nginx/main/docs/examples/http-svc.yaml
## Create an ingress (please add any additional annotation required)
echo "
apiVersion: networking.k8s.io/v1
kind: Ingress
metadata:
name: foo-bar
annotations:
kubernetes.io/ingress.class: nginx
spec:
ingressClassName: nginx # omit this if you're on controller version below 1.0.0
rules:
- host: foo.bar
http:
paths:
- path: /
pathType: Prefix
backend:
service:
name: http-svc
port:
number: 80
" | kubectl apply -f -
## make a request
POD_NAME=$(k get pods -n ingress-nginx -l app.kubernetes.io/name=ingress-nginx -o NAME)
kubectl exec -it -n ingress-nginx $POD_NAME -- curl -H 'Host: foo.bar' localhost
--->
**Anything else we need to know**:
This might not be a bug.
<!-- If this is actually about documentation, uncomment the following block -->
<!--
/kind documentation
/remove-kind bug
-->
|
1.0
|
controller goes into CrashLoopBackOff if controller.publishService.pathOverride is invalid - **What happened**:
Deploying the controller with an invalid value for controller.publishService.pathOverride results in the controller exiting immediately after starting. This happened due to a typo in the configuration.
Similar reports in https://github.com/kubernetes/ingress-nginx/issues/7770.
Logs:
```
I0327 08:11:21.736159 7 main.go:253] "Running in Kubernetes cluster" major="1" minor="25+" git="v1.25.6-eks-48e63af" state="clean" commit="9f22d4ae876173884749c0701f01340879ab3f95" platform="linux/amd64"
F0327 08:11:21.742783 7 main.go:92] No service with name test found in namespace nginx: services "test" not found
```
**What you expected to happen**:
Not sure if the controller should start, log the exception, and serve traffic as expected. If the controller should not run with an invalid pathOverride configuration, then this issue can be closed as "not a bug".
**NGINX Ingress controller version** (exec into the pod and run nginx-ingress-controller --version.):
```
registry.k8s.io/ingress-nginx/controller:v1.6.4@sha256:15be4666c53052484dd2992efacf2f50ea77a78ae8aa21ccd91af6baaa7ea22f
```
**Kubernetes version** (use `kubectl version`):
```
v1.25.6-eks-48e63af
```
**Environment**:
- **How was the ingress-nginx-controller installed**:
- Kustomize
```
helmCharts:
- name: ingress-nginx
repo: https://kubernetes.github.io/ingress-nginx
version: 4.5.2
valuesFile: nginx.yaml
releaseName: alpha-public
```
nginx.yaml:
```
controller:
publishService:
enabled: true
pathOverride: nginx/test
service:
type: ClusterIP
enableHttp: false
```
- **Current State of the controller**:
- `kubectl describe ingressclasses`
- `kubectl -n <ingresscontrollernamespace> get all -A -o wide`
- `kubectl -n <ingresscontrollernamespace> describe po <ingresscontrollerpodname>`
- `kubectl -n <ingresscontrollernamespace> describe svc <ingresscontrollerservicename>`
- **Current state of ingress object, if applicable**:
- `kubectl -n <appnnamespace> get all,ing -o wide`
- `kubectl -n <appnamespace> describe ing <ingressname>`
- If applicable, then, your complete and exact curl/grpcurl command (redacted if required) and the reponse to the curl/grpcurl command with the -v flag
- **Others**:
- Any other related information like ;
- copy/paste of the snippet (if applicable)
- `kubectl describe ...` of any custom configmap(s) created and in use
- Any other related information that may help
**How to reproduce this issue**:
<!---
As minimally and precisely as possible. Keep in mind we do not have access to your cluster or application.
Help up us (if possible) reproducing the issue using minikube or kind.
## Install minikube/kind
- Minikube https://minikube.sigs.k8s.io/docs/start/
- Kind https://kind.sigs.k8s.io/docs/user/quick-start/
## Install the ingress controller
kubectl apply -f https://raw.githubusercontent.com/kubernetes/ingress-nginx/main/deploy/static/provider/baremetal/deploy.yaml
## Install an application that will act as default backend (is just an echo app)
kubectl apply -f https://raw.githubusercontent.com/kubernetes/ingress-nginx/main/docs/examples/http-svc.yaml
## Create an ingress (please add any additional annotation required)
echo "
apiVersion: networking.k8s.io/v1
kind: Ingress
metadata:
name: foo-bar
annotations:
kubernetes.io/ingress.class: nginx
spec:
ingressClassName: nginx # omit this if you're on controller version below 1.0.0
rules:
- host: foo.bar
http:
paths:
- path: /
pathType: Prefix
backend:
service:
name: http-svc
port:
number: 80
" | kubectl apply -f -
## make a request
POD_NAME=$(k get pods -n ingress-nginx -l app.kubernetes.io/name=ingress-nginx -o NAME)
kubectl exec -it -n ingress-nginx $POD_NAME -- curl -H 'Host: foo.bar' localhost
--->
**Anything else we need to know**:
This might not be a bug.
<!-- If this is actually about documentation, uncomment the following block -->
<!--
/kind documentation
/remove-kind bug
-->
|
non_process
|
controller goes into crashloopbackoff if controller publishservice pathoverride is invalid what happened deploying the controller with a invalid value for controller publishservice pathoverride results in the controller exiting immediately after starting this happened due to a typo in the configuration similar reports in logs main go running in kubernetes cluster major minor git eks state clean commit platform linux main go no service with name test found in namespace nginx services test not found what you expected to happen not sure if the controller should start and log the exception and serve traffic as expected if the controller should not be running with an invalid pathoverride configuration that this issue can be closed as not a bug nginx ingress controller version exec into the pod and run nginx ingress controller version registry io ingress nginx controller kubernetes version use kubectl version eks environment how was the ingress nginx controller installed kustomize helmcharts name ingress nginx repo version valuesfile nginx yaml releasename alpha public nginx yaml controller publishservice enabled true pathoverride nginx test service type clusterip enablehttp false current state of the controller kubectl describe ingressclasses kubectl n get all a o wide kubectl n describe po kubectl n describe svc current state of ingress object if applicable kubectl n get all ing o wide kubectl n describe ing if applicable then your complete and exact curl grpcurl command redacted if required and the reponse to the curl grpcurl command with the v flag others any other related information like copy paste of the snippet if applicable kubectl describe of any custom configmap s created and in use any other related information that may help how to reproduce this issue as minimally and precisely as possible keep in mind we do not have access to your cluster or application help up us if possible reproducing the issue using minikube or kind install minikube kind minikube kind 
install the ingress controller kubectl apply f install an application that will act as default backend is just an echo app kubectl apply f create an ingress please add any additional annotation required echo apiversion networking io kind ingress metadata name foo bar annotations kubernetes io ingress class nginx spec ingressclassname nginx omit this if you re on controller version below rules host foo bar http paths path pathtype prefix backend service name http svc port number kubectl apply f make a request pod name k get pods n ingress nginx l app kubernetes io name ingress nginx o name kubectl exec it n ingress nginx pod name curl h host foo bar localhost anything else we need to know this might not be a bug kind documentation remove kind bug
| 0
|
460,977
| 13,221,599,214
|
IssuesEvent
|
2020-08-17 14:17:37
|
clowdr-app/clowdr-web-app
|
https://api.github.com/repos/clowdr-app/clowdr-web-app
|
opened
|
Add Lobby to list of text chats
|
Medium priority
|
We've eliminated almost all of the session-level text chats, but there are still a number of places where the Lobby chat shows, and I think this is something we want to keep. But it is then a bit confusing that people can't find it in the usual list of text chats.
Not critical for ICFP, if not easy.
|
1.0
|
Add Lobby to list of text chats - We've eliminated almost all of the session-level text chats, but there are still a number of places where the Lobby chat shows, and I think this is something we want to keep. But it is then a bit confusing that people can't find it in the usual list of text chats.
Not critical for ICFP, if not easy.
|
non_process
|
add lobby to list of text chats we ve eliminated almost all of the session level text chats but there are still a number of places where the lobby chat shows and i think this is something we want to keep but it is then a bit confusing that people can t find it in the usual list of text chats not critical for icfp if not easy
| 0
|
3,262
| 3,385,077,850
|
IssuesEvent
|
2015-11-27 09:25:22
|
coreos/rkt
|
https://api.github.com/repos/coreos/rkt
|
closed
|
stage0: Can't use paths with a + in them in volumes
|
area/usability dependency/appc spec kind/bug
|
```
sudo ./rkt run --mds-register=false --volume html,kind=host,source=/tmp/h+ml aci.gonyeo.com/nginx
...
Failed to stat /tmp/h ml: No such file or directory
```
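One plausible mechanism for the `+` turning into a space (an assumption for illustration, not confirmed from the rkt source) is form-style URL decoding of the volume source path, where `+` encodes a space — plain percent-decoding would leave it intact:

```python
from urllib.parse import unquote, unquote_plus

path = "/tmp/h+ml"

# Form-style decoding (application/x-www-form-urlencoded) maps '+' to ' ',
# which would explain the "Failed to stat /tmp/h ml" error above:
form_decoded = unquote_plus(path)   # "/tmp/h ml"

# Plain percent-decoding leaves '+' untouched:
percent_decoded = unquote(path)     # "/tmp/h+ml"
```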
|
True
|
stage0: Can't use paths with a + in them in volumes - ```
sudo ./rkt run --mds-register=false --volume html,kind=host,source=/tmp/h+ml aci.gonyeo.com/nginx
...
Failed to stat /tmp/h ml: No such file or directory
```
|
non_process
|
can t use paths with a in them in volumes sudo rkt run mds register false volume html kind host source tmp h ml aci gonyeo com nginx failed to stat tmp h ml no such file or directory
| 0
|
11,768
| 14,598,225,422
|
IssuesEvent
|
2020-12-20 23:51:58
|
ewen-lbh/portfolio
|
https://api.github.com/repos/ewen-lbh/portfolio
|
opened
|
Add subline for made with
|
processing
|
See prototype: there's an additional subline over the main tech name (i.e. _Adobe_ Photoshop). Implement that with a `Tag.Author` on the go struct.
|
1.0
|
Add subline for made with - See prototype: there's an additional subline over the main tech name (i.e. _Adobe_ Photoshop). Implement that with a `Tag.Author` on the go struct.
|
process
|
add subline for made with see prototype there s an additional subline over the main tech name ie adobe photoshop implement that with a tag author on the go struct
| 1
|
7,636
| 10,733,298,071
|
IssuesEvent
|
2019-10-29 00:49:04
|
AlmuraDev/SGCraft
|
https://api.github.com/repos/AlmuraDev/SGCraft
|
closed
|
Event Horizon isn't destroying blocks
|
bug in process
|
Blocks within the gate ring itself are not being destroyed.
SGCraft 2.0.2
|
1.0
|
Event Horizon isn't destroying blocks - Blocks within the gate ring itself are not being destroyed.
SGCraft 2.0.2
|
process
|
event horizon isn t destroying blocks blocks within the gate ring itself are not being destroyed sgcraft
| 1
|
450,770
| 13,019,014,878
|
IssuesEvent
|
2020-07-26 20:15:00
|
aaugustin/websockets
|
https://api.github.com/repos/aaugustin/websockets
|
closed
|
Add assertions to validate fail_connection requirements
|
enhancement low priority
|
In https://github.com/aaugustin/websockets/issues/465#issuecomment-434479748 I said:
> Don't call fail_connection() with the default code (1006) unless:
>
> a. you changed the connection state to CLOSING, typically by writing a Close frame, or
> b. you know that the connection is dead already, typically because you hit ConnectionError
`fail_connection` should enforce these requirements with assertions.
That would make it easier to diagnose such issues, which could happen in websockets itself.
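A minimal sketch of what such precondition guards could look like (the class name, state constants, and attribute names are illustrative assumptions, not the actual websockets implementation):

```python
# Illustrative sketch only; Connection, state constants, and transport_dead
# are assumed names, not the websockets library's API.
OPEN, CLOSING, CLOSED = "OPEN", "CLOSING", "CLOSED"

class Connection:
    def __init__(self):
        self.state = OPEN
        self.transport_dead = False  # e.g. set after a ConnectionError

    def fail_connection(self, code=1006):
        if code == 1006:
            # Enforce the documented preconditions for the default code:
            # either the state was already moved to CLOSING (a Close frame
            # was written) or the connection is known to be dead.
            assert self.state == CLOSING or self.transport_dead, (
                "fail_connection(1006) requires CLOSING state "
                "or a dead transport"
            )
        self.state = CLOSED
```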
|
1.0
|
Add assertions to validate fail_connection requirements - In https://github.com/aaugustin/websockets/issues/465#issuecomment-434479748 I said:
> Don't call fail_connection() with the default code (1006) unless:
>
> a. you changed the connection state to CLOSING, typically by writing a Close frame, or
> b. you know that the connection is dead already, typically because you hit ConnectionError
`fail_connection` should enforce these requirements with assertions.
That would make it easier to diagnose such issues, which could happen in websockets itself.
|
non_process
|
add assertions to validate fail connection requirements in i said don t call fail connection with the default code unless a you changed the connection state to closing typically by writing a close frame or b you know that the connection is dead already typically because you hit connectionerror fail connection should enforce these requirements with assertions that would make it easier to diagnose such issues which could happen in websockets itself
| 0
|
20,066
| 26,555,710,110
|
IssuesEvent
|
2023-01-20 11:49:59
|
firebase/firebase-cpp-sdk
|
https://api.github.com/repos/firebase/firebase-cpp-sdk
|
closed
|
[C++] Nightly Integration Testing Report for Firestore
|
type: process nightly-testing
|
<hidden value="integration-test-status-comment"></hidden>
### [build against repo] Integration test with FLAKINESS (succeeded after retry)
Requested by @sunmou99 on commit 45f8e3268c2adbabca165ed0a937835f18930d2f
Last updated: Thu Jan 19 04:03 PST 2023
**[View integration test log & download artifacts](https://github.com/firebase/firebase-cpp-sdk/actions/runs/3957333603)**
| Failures | Configs |
|----------|---------|
| firestore | [TEST] [FLAKINESS] [Android] [2/3 os: windows ubuntu] [1/4 android_device: android_target]<details><summary>(1 failed tests)</summary> CRASH/TIMEOUT</details> |
Add flaky tests to **[go/fpl-cpp-flake-tracker](http://go/fpl-cpp-flake-tracker)**
<hidden value="integration-test-status-comment"></hidden>
***
### ✅ [build against SDK] Integration test succeeded!
Requested by @firebase-workflow-trigger[bot] on commit 45f8e3268c2adbabca165ed0a937835f18930d2f
Last updated: Thu Jan 19 05:53 PST 2023
**[View integration test log & download artifacts](https://github.com/firebase/firebase-cpp-sdk/actions/runs/3958351005)**
<hidden value="integration-test-status-comment"></hidden>
***
### ✅ [build against tip] Integration test succeeded!
Requested by @sunmou99 on commit 45f8e3268c2adbabca165ed0a937835f18930d2f
Last updated: Fri Jan 20 03:46 PST 2023
**[View integration test log & download artifacts](https://github.com/firebase/firebase-cpp-sdk/actions/runs/3967012693)**
|
1.0
|
[C++] Nightly Integration Testing Report for Firestore -
<hidden value="integration-test-status-comment"></hidden>
### [build against repo] Integration test with FLAKINESS (succeeded after retry)
Requested by @sunmou99 on commit 45f8e3268c2adbabca165ed0a937835f18930d2f
Last updated: Thu Jan 19 04:03 PST 2023
**[View integration test log & download artifacts](https://github.com/firebase/firebase-cpp-sdk/actions/runs/3957333603)**
| Failures | Configs |
|----------|---------|
| firestore | [TEST] [FLAKINESS] [Android] [2/3 os: windows ubuntu] [1/4 android_device: android_target]<details><summary>(1 failed tests)</summary> CRASH/TIMEOUT</details> |
Add flaky tests to **[go/fpl-cpp-flake-tracker](http://go/fpl-cpp-flake-tracker)**
<hidden value="integration-test-status-comment"></hidden>
***
### ✅ [build against SDK] Integration test succeeded!
Requested by @firebase-workflow-trigger[bot] on commit 45f8e3268c2adbabca165ed0a937835f18930d2f
Last updated: Thu Jan 19 05:53 PST 2023
**[View integration test log & download artifacts](https://github.com/firebase/firebase-cpp-sdk/actions/runs/3958351005)**
<hidden value="integration-test-status-comment"></hidden>
***
### ✅ [build against tip] Integration test succeeded!
Requested by @sunmou99 on commit 45f8e3268c2adbabca165ed0a937835f18930d2f
Last updated: Fri Jan 20 03:46 PST 2023
**[View integration test log & download artifacts](https://github.com/firebase/firebase-cpp-sdk/actions/runs/3967012693)**
|
process
|
nightly integration testing report for firestore integration test with flakiness succeeded after retry requested by on commit last updated thu jan pst failures configs firestore failed tests nbsp nbsp crash timeout add flaky tests to ✅ nbsp integration test succeeded requested by firebase workflow trigger on commit last updated thu jan pst ✅ nbsp integration test succeeded requested by on commit last updated fri jan pst
| 1
|
22,384
| 31,142,284,448
|
IssuesEvent
|
2023-08-16 01:44:07
|
cypress-io/cypress
|
https://api.github.com/repos/cypress-io/cypress
|
closed
|
Flaky test: by deleting query member
|
stage: backlog process: flaky test topic: flake ❄️ priority: low stage: flake topic: net_stubbing.cy.ts stale
|
### Link to dashboard or CircleCI failure
https://dashboard.cypress.io/projects/ypt4pf/runs/38102/test-results/f7e0ef07-6232-450b-9b29-b9728b276a8f
### Link to failing test in GitHub
https://github.com/cypress-io/cypress/blob/develop/packages/driver/cypress/e2e/commands/net_stubbing.cy.ts#L1836
### Analysis
<img width="442" alt="Screen Shot 2022-08-17 at 9 17 09 PM" src="https://user-images.githubusercontent.com/26726429/185292549-d8a80538-f081-4023-8cde-506204b539cf.png">
### Cypress Version
10.6.0
### Other
Search for this issue number in the codebase to find the test(s) skipped until this issue is fixed
|
1.0
|
Flaky test: by deleting query member - ### Link to dashboard or CircleCI failure
https://dashboard.cypress.io/projects/ypt4pf/runs/38102/test-results/f7e0ef07-6232-450b-9b29-b9728b276a8f
### Link to failing test in GitHub
https://github.com/cypress-io/cypress/blob/develop/packages/driver/cypress/e2e/commands/net_stubbing.cy.ts#L1836
### Analysis
<img width="442" alt="Screen Shot 2022-08-17 at 9 17 09 PM" src="https://user-images.githubusercontent.com/26726429/185292549-d8a80538-f081-4023-8cde-506204b539cf.png">
### Cypress Version
10.6.0
### Other
Search for this issue number in the codebase to find the test(s) skipped until this issue is fixed
|
process
|
flaky test by deleting query member link to dashboard or circleci failure link to failing test in github analysis img width alt screen shot at pm src cypress version other search for this issue number in the codebase to find the test s skipped until this issue is fixed
| 1
|
21,922
| 30,446,459,805
|
IssuesEvent
|
2023-07-15 18:31:22
|
h4sh5/pypi-auto-scanner
|
https://api.github.com/repos/h4sh5/pypi-auto-scanner
|
opened
|
pyutils 0.0.1b5 has 2 GuardDog issues
|
guarddog typosquatting silent-process-execution
|
https://pypi.org/project/pyutils
https://inspector.pypi.io/project/pyutils
```{
"dependency": "pyutils",
"version": "0.0.1b5",
"result": {
"issues": 2,
"errors": {},
"results": {
"typosquatting": "This package closely ressembles the following package names, and might be a typosquatting attempt: pytils, python-utils",
"silent-process-execution": [
{
"location": "pyutils-0.0.1b5/src/pyutils/exec_utils.py:204",
"code": " subproc = subprocess.Popen(\n args,\n stdin=subprocess.DEVNULL,\n stdout=subprocess.DEVNULL,\n stderr=subprocess.DEVNULL,\n )",
"message": "This package is silently executing an external binary, redirecting stdout, stderr and stdin to /dev/null"
}
]
},
"path": "/tmp/tmpyzptzopu/pyutils"
}
}```
|
1.0
|
pyutils 0.0.1b5 has 2 GuardDog issues - https://pypi.org/project/pyutils
https://inspector.pypi.io/project/pyutils
```{
"dependency": "pyutils",
"version": "0.0.1b5",
"result": {
"issues": 2,
"errors": {},
"results": {
"typosquatting": "This package closely ressembles the following package names, and might be a typosquatting attempt: pytils, python-utils",
"silent-process-execution": [
{
"location": "pyutils-0.0.1b5/src/pyutils/exec_utils.py:204",
"code": " subproc = subprocess.Popen(\n args,\n stdin=subprocess.DEVNULL,\n stdout=subprocess.DEVNULL,\n stderr=subprocess.DEVNULL,\n )",
"message": "This package is silently executing an external binary, redirecting stdout, stderr and stdin to /dev/null"
}
]
},
"path": "/tmp/tmpyzptzopu/pyutils"
}
}```
|
process
|
pyutils has guarddog issues dependency pyutils version result issues errors results typosquatting this package closely resembles the following package names and might be a typosquatting attempt pytils python utils silent process execution location pyutils src pyutils exec utils py code subproc subprocess popen n args n stdin subprocess devnull n stdout subprocess devnull n stderr subprocess devnull n message this package is silently executing an external binary redirecting stdout stderr and stdin to dev null path tmp tmpyzptzopu pyutils
| 1
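The GuardDog record above flags `pyutils` as a possible typosquat of `pytils` and `python-utils` based on name similarity. A minimal sketch of that kind of check, using `difflib` with an illustrative 0.8 ratio threshold (an assumption — GuardDog's actual heuristic differs and is looser, since it also flagged `python-utils`):

```python
import difflib

# Hypothetical list of "popular" names to compare against; not GuardDog's data.
POPULAR_PACKAGES = ["pytils", "python-utils", "requests", "numpy"]

def possible_typosquats(name, known=POPULAR_PACKAGES, threshold=0.8):
    """Return known package names whose similarity to `name` meets the threshold."""
    return [
        pkg for pkg in known
        if pkg != name
        and difflib.SequenceMatcher(None, name, pkg).ratio() >= threshold
    ]
```

With this threshold, `possible_typosquats("pyutils")` catches `pytils` but not `python-utils`; a real scanner would combine several distance measures rather than a single ratio.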
|
11,082
| 13,923,922,100
|
IssuesEvent
|
2020-10-21 14:57:56
|
timberio/vector
|
https://api.github.com/repos/timberio/vector
|
closed
|
New `split` remap function
|
domain: processing domain: remap type: feature
|
The `split` remap function splits a string with the provided pattern
## Example
Given this event for all examples:
```js
{
"message": "This is a long string"
}
```
### Default
By default, the `split` function splits on all occurrences of the passed pattern:
```
.message = split(.message, / /)
```
Would result in:
```js
{
"message": ["This", "is", "a", "long", "string"]
}
```
### Limit
Users can optionally supply a limit, to limit the number of items:
```
.message = split(.message, / /, 2)
```
Would result in:
```js
{
"message": ["This", "is a long string"]
}
```
|
1.0
|
New `split` remap function - The `split` remap function splits a string with the provided pattern
## Example
Given this event for all examples:
```js
{
"message": "This is a long string"
}
```
### Default
By default, the `split` function splits on all occurrences of the passed pattern:
```
.message = split(.message, / /)
```
Would result in:
```js
{
"message": ["This", "is", "a", "long", "string"]
}
```
### Limit
Users can optionally supply a limit, to limit the number of items:
```
.message = split(.message, / /, 2)
```
Would result in:
```js
{
"message": ["This", "is a long string"]
}
```
|
process
|
new split remap function the split remap function splits a string with the provided pattern example given this event for all examples js message this is a long string default by default the split function splits on all occurrences of the passed pattern message split message would result in js message limit users can optionally supply a limit to limit the number of items message split message would result in js message
| 1
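The proposed behaviour in the record above maps onto a small Python sketch. The real function belongs to Vector's remap language; `remap_split` here is a hypothetical stand-in, and the `limit` argument (which caps the number of resulting items) is translated to Python's `maxsplit`, which counts splits rather than items:

```python
import re

def remap_split(value, pattern, limit=None):
    """Split `value` on `pattern`; with a limit, return at most `limit` items."""
    if limit is None:
        return re.split(pattern, value)
    # limit counts items, maxsplit counts splits, hence limit - 1.
    return re.split(pattern, value, maxsplit=limit - 1)
```

Both examples from the record hold: splitting `"This is a long string"` on a space yields five items, and with `limit=2` yields `["This", "is a long string"]`.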
|
10,855
| 13,630,314,025
|
IssuesEvent
|
2020-09-24 16:15:51
|
eddieantonio/predictive-text-studio
|
https://api.github.com/repos/eddieantonio/predictive-text-studio
|
opened
|
Update to actual kmp.json format 😫
|
bug data-backing data-processing enhancement worker
|
I was wrong about the format of a `kmp.json` file. It actually follows this spec:
Documentation: https://help.keyman.com/developer/11.0/reference/file-types/metadata
(Outdated) Documentation: https://github.com/keymanapp/keyman/wiki/KMP-Metadata-File-(kmp.inf-and-kmp.json)#fields
JSON Schema: https://api.keyman.com/schemas/package.json
Here are some example `kmp.json` files to get you started:
This one is from https://downloads.keyman.com/models/nrc.en.mtnt/0.1.4/nrc.en.mtnt.model.kmp:
```json
{
"system": {
"keymanDeveloperVersion": "12.0.1500.0",
"fileVersion": "12.0"
},
"options": {
"followKeyboardVersion": false
},
"info": {
"author": {
"description": "Eddie Antonio Santos",
"url": "mailto:easantos@ualberta.ca"
},
"copyright": {
"description": "© 2019 National Research Council Canada"
},
"name": {
"description": "English language model mined from MTNT"
},
"version": {
"description": "0.1.4"
}
},
"files": [
{
"name": "nrc.en.mtnt.model.js",
"description": "Lexical model nrc.en.mtnt.model.js",
"copyLocation": "0",
"fileType": ".model.js"
}
],
"lexicalModels": [
{
"name": "English dictionary (MTNT)",
"id": "nrc.en.mtnt",
"languages": [
{
"name": "English",
"id": "en"
},
{
"name": "English (US)",
"id": "en-us"
},
{
"name": "English (Canada)",
"id": "en-ca"
}
]
}
]
}
```
This one is from https://downloads.keyman.com/models/nrc.str.sencoten/1.0.5/nrc.str.sencoten.model.kmp
```
{
"system": {
"keymanDeveloperVersion": "12.0.1500.0",
"fileVersion": "12.0"
},
"options": {
"followKeyboardVersion": false
},
"info": {
"author": {
"description": "Eddie Antonio Santos",
"url": "mailto:Eddie.Santos@nrc-cnrc.gc.ca"
},
"copyright": {
"description": "© 2019 Timothy Montler and the W̱SÁNEĆ School Board"
},
"name": {
"description": "SENĆOŦEN (Saanich Dialect) Lexical Model"
},
"version": {
"description": "1.0.5"
}
},
"files": [
{
"name": "nrc.str.sencoten.model.js",
"description": "Lexical model nrc.str.sencoten.model.js",
"copyLocation": "0",
"fileType": ".model.js"
}
],
"lexicalModels": [
{
"name": "SENĆOŦEN dictionary",
"id": "nrc.str.sencoten",
"languages": [
{
"name": "North Straits Salish",
"id": "str"
},
{
"name": "SENĆOŦEN",
"id": "str-Latn"
}
]
}
]
}
```
---
**NOTE**: this does _not_ invalidate #72 or #51! Those are still useful. The `.model_info` files are still used for **public** distribution of Keyman lexical models. I mistakenly assumed that the `kmp.json` and `.model_info` files were the same 😩
|
1.0
|
Update to actual kmp.json format 😫 - I was wrong about the format of a `kmp.json` file. It actually follows this spec:
Documentation: https://help.keyman.com/developer/11.0/reference/file-types/metadata
(Outdated) Documentation: https://github.com/keymanapp/keyman/wiki/KMP-Metadata-File-(kmp.inf-and-kmp.json)#fields
JSON Schema: https://api.keyman.com/schemas/package.json
Here are some example `kmp.json` files to get you started:
This one is from https://downloads.keyman.com/models/nrc.en.mtnt/0.1.4/nrc.en.mtnt.model.kmp:
```json
{
"system": {
"keymanDeveloperVersion": "12.0.1500.0",
"fileVersion": "12.0"
},
"options": {
"followKeyboardVersion": false
},
"info": {
"author": {
"description": "Eddie Antonio Santos",
"url": "mailto:easantos@ualberta.ca"
},
"copyright": {
"description": "© 2019 National Research Council Canada"
},
"name": {
"description": "English language model mined from MTNT"
},
"version": {
"description": "0.1.4"
}
},
"files": [
{
"name": "nrc.en.mtnt.model.js",
"description": "Lexical model nrc.en.mtnt.model.js",
"copyLocation": "0",
"fileType": ".model.js"
}
],
"lexicalModels": [
{
"name": "English dictionary (MTNT)",
"id": "nrc.en.mtnt",
"languages": [
{
"name": "English",
"id": "en"
},
{
"name": "English (US)",
"id": "en-us"
},
{
"name": "English (Canada)",
"id": "en-ca"
}
]
}
]
}
```
This one is from https://downloads.keyman.com/models/nrc.str.sencoten/1.0.5/nrc.str.sencoten.model.kmp
```
{
"system": {
"keymanDeveloperVersion": "12.0.1500.0",
"fileVersion": "12.0"
},
"options": {
"followKeyboardVersion": false
},
"info": {
"author": {
"description": "Eddie Antonio Santos",
"url": "mailto:Eddie.Santos@nrc-cnrc.gc.ca"
},
"copyright": {
"description": "© 2019 Timothy Montler and the W̱SÁNEĆ School Board"
},
"name": {
"description": "SENĆOŦEN (Saanich Dialect) Lexical Model"
},
"version": {
"description": "1.0.5"
}
},
"files": [
{
"name": "nrc.str.sencoten.model.js",
"description": "Lexical model nrc.str.sencoten.model.js",
"copyLocation": "0",
"fileType": ".model.js"
}
],
"lexicalModels": [
{
"name": "SENĆOŦEN dictionary",
"id": "nrc.str.sencoten",
"languages": [
{
"name": "North Straits Salish",
"id": "str"
},
{
"name": "SENĆOŦEN",
"id": "str-Latn"
}
]
}
]
}
```
---
**NOTE**: this does _not_ invalidate #72 or #51! Those are still useful. The `.model_info` files are still used for **public** distribution of Keyman lexical models. I mistakenly assumed that the `kmp.json` and `.model_info` files were the same 😩
|
process
|
update to actual kmp json format 😫 i was wrong about the format of a kmp json file it actually follows this spec documentation outdated documentation json schema here are some example kmp json files to get you started this one is from json system keymandeveloperversion fileversion options followkeyboardversion false info author description eddie antonio santos url mailto easantos ualberta ca copyright description © national research council canada name description english language model mined from mtnt version description files name nrc en mtnt model js description lexical model nrc en mtnt model js copylocation filetype model js lexicalmodels name english dictionary mtnt id nrc en mtnt languages name english id en name english us id en us name english canada id en ca this one is from system keymandeveloperversion fileversion options followkeyboardversion false info author description eddie antonio santos url mailto eddie santos nrc cnrc gc ca copyright description © timothy montler and the w̱sáneć school board name description senćoŧen saanich dialect lexical model version description files name nrc str sencoten model js description lexical model nrc str sencoten model js copylocation filetype model js lexicalmodels name senćoŧen dictionary id nrc str sencoten languages name north straits salish id str name senćoŧen id str latn note this does not invalidate or those are still useful the model info files are still used for public distribution of keyman lexical models i mistakenly assumed that the kmp json and model info files were the same 😩
| 1
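Following the `kmp.json` examples in the record above, a consumer mostly needs the `lexicalModels` entries. A minimal sketch of pulling out each model ID with its language IDs — `extract_models` is a hypothetical helper for illustration, not part of any Keyman tooling:

```python
import json

def extract_models(kmp_json_text):
    """Map each lexical model id to the list of language ids it declares."""
    data = json.loads(kmp_json_text)
    return {
        model["id"]: [lang["id"] for lang in model.get("languages", [])]
        for model in data.get("lexicalModels", [])
    }
```

Run against the first example above, this would yield `{"nrc.en.mtnt": ["en", "en-us", "en-ca"]}`.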
|
1,839
| 4,646,807,975
|
IssuesEvent
|
2016-10-01 04:07:27
|
fabiocav/fabio-random
|
https://api.github.com/repos/fabiocav/fabio-random
|
closed
|
test
|
processed-by-function
|
### Summary
### Expected behavior
### Actual behavior
### Steps to reproduce
#### function.json contents
```json
If applicable, paste your function.json contents here
```
#### function code
```
If applicable, paste your function code here
```
|
1.0
|
test - ### Summary
### Expected behavior
### Actual behavior
### Steps to reproduce
#### function.json contents
```json
If applicable, paste your function.json contents here
```
#### function code
```
If applicable, paste your function code here
```
|
process
|
test summary expected behavior actual behavior steps to reproduce function json contents json if applicable paste your function json contents here function code if applicable paste your function code here
| 1
|
2,261
| 5,093,456,443
|
IssuesEvent
|
2017-01-03 06:11:38
|
dita-ot/dita-ot
|
https://api.github.com/repos/dita-ot/dita-ot
|
closed
|
Rewrite duplicate copy-to attributes with force-unique
|
bug P2 preprocess
|
In DITA-OT 2.4 `force-unique` will not rewrite `@copy-to` attributes when the same `@href` appears multiple times in a map. This should be fixed so that if multiple `@copy-to` contain the same value, all but the first should be rewritten.
|
1.0
|
Rewrite duplicate copy-to attributes with force-unique - In DITA-OT 2.4 `force-unique` will not rewrite `@copy-to` attributes when the same `@href` appears multiple times in a map. This should be fixed so that if multiple `@copy-to` contain the same value, all but the first should be rewritten.
|
process
|
rewrite duplicate copy to attributes with force unique in dita ot force unique will not rewrite copy to attributes when the same href appears multiple times in a map this should be fixed so that if multiple copy to contain the same value all but the first should be rewritten
| 1
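The fix described in the DITA-OT record above — when the same `@copy-to` value occurs more than once, keep the first occurrence and rewrite the rest — can be sketched as follows. The numeric-suffix renaming scheme is an assumption for illustration; DITA-OT's actual `force-unique` renaming may differ:

```python
def dedupe_copy_to(values):
    """Keep the first occurrence of each value; suffix later duplicates."""
    seen = {}
    result = []
    for value in values:
        count = seen.get(value, 0)
        seen[value] = count + 1
        if count == 0:
            result.append(value)
        else:
            # Insert the suffix before the extension when one exists.
            stem, dot, ext = value.rpartition(".")
            result.append(f"{stem}-{count}.{ext}" if dot else f"{value}-{count}")
    return result
```

For example, `["a.dita", "a.dita", "b.dita", "a.dita"]` becomes `["a.dita", "a-1.dita", "b.dita", "a-2.dita"]`.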
|
418
| 2,587,890,357
|
IssuesEvent
|
2015-02-17 21:19:03
|
spyder-ide/spyder
|
https://api.github.com/repos/spyder-ide/spyder
|
opened
|
Freezes when dealing with netcdf files
|
1 star bug imported Usability Variable-Explorer
|
_From [zzhen...@gmail.com](https://code.google.com/u/102660079536155033095/) on 2014-09-27T02:23:12Z_
I have a variable read from a netCDF file, its shape is "**.shape --> (77,)", so this is not a large data.
But when I want to print it out, the program just stay still. Why does this happen?
_Original issue: http://code.google.com/p/spyderlib/issues/detail?id=1988_
|
True
|
Freezes when dealing with netcdf files - _From [zzhen...@gmail.com](https://code.google.com/u/102660079536155033095/) on 2014-09-27T02:23:12Z_
I have a variable read from a netCDF file, its shape is "**.shape --> (77,)", so this is not a large data.
But when I want to print it out, the program just stay still. Why does this happen?
_Original issue: http://code.google.com/p/spyderlib/issues/detail?id=1988_
|
non_process
|
freezes when dealing with netcdf files from on i have a variable read from a netcdf file its shape is shape so this is not a large data but when i want to print it out the program just stay still why does this happen original issue
| 0
|
4,922
| 7,795,069,282
|
IssuesEvent
|
2018-06-08 06:38:57
|
StrikeNP/trac_test
|
https://api.github.com/repos/StrikeNP/trac_test
|
closed
|
Add date to page title (Trac #108)
|
Migrated from Trac enhancement post_processing senkbeil@uwm.edu
|
I think it would be nice if the date the plots were generated was added to the page title. This would make it easier to remember which set of plots is from which day.
Attachments:
Migrated from http://carson.math.uwm.edu/trac/clubb/ticket/108
```json
{
"status": "closed",
"changetime": "2009-09-02T20:38:45",
"description": "I think it would be nice if the date the plots were generated was added to the page title. This would make it easier to remember which set of plots is from which day.",
"reporter": "nielsenb@uwm.edu",
"cc": "senkbeil@uwm.edu",
"resolution": "Verified by V. Larson",
"_ts": "1251923925000000",
"component": "post_processing",
"summary": "Add date to page title",
"priority": "minor",
"keywords": "plotgen",
"time": "2009-06-30T13:59:26",
"milestone": "Plotgen 3.0",
"owner": "senkbeil@uwm.edu",
"type": "enhancement"
}
```
|
1.0
|
Add date to page title (Trac #108) - I think it would be nice if the date the plots were generated was added to the page title. This would make it easier to remember which set of plots is from which day.
Attachments:
Migrated from http://carson.math.uwm.edu/trac/clubb/ticket/108
```json
{
"status": "closed",
"changetime": "2009-09-02T20:38:45",
"description": "I think it would be nice if the date the plots were generated was added to the page title. This would make it easier to remember which set of plots is from which day.",
"reporter": "nielsenb@uwm.edu",
"cc": "senkbeil@uwm.edu",
"resolution": "Verified by V. Larson",
"_ts": "1251923925000000",
"component": "post_processing",
"summary": "Add date to page title",
"priority": "minor",
"keywords": "plotgen",
"time": "2009-06-30T13:59:26",
"milestone": "Plotgen 3.0",
"owner": "senkbeil@uwm.edu",
"type": "enhancement"
}
```
|
process
|
add date to page title trac i think it would be nice if the date the plots were generated was added to the page title this would make it easier to remember which set of plots is from which day attachments migrated from json status closed changetime description i think it would be nice if the date the plots were generated was added to the page title this would make it easier to remember which set of plots is from which day reporter nielsenb uwm edu cc senkbeil uwm edu resolution verified by v larson ts component post processing summary add date to page title priority minor keywords plotgen time milestone plotgen owner senkbeil uwm edu type enhancement
| 1
|
5,756
| 8,408,490,610
|
IssuesEvent
|
2018-10-12 01:57:26
|
LooseScrews5237/ftc_app
|
https://api.github.com/repos/LooseScrews5237/ftc_app
|
closed
|
Half Speed Button
|
requirement
|
The drive team has requested that we provide a button that can reduce the robot speed by 50%. The button we will map this to is right trigger. When the trigger is released, the robot should go back to full speed.
|
1.0
|
Half Speed Button - The drive team has requested that we provide a button that can reduce the robot speed by 50%. The button we will map this to is right trigger. When the trigger is released, the robot should go back to full speed.
|
non_process
|
half speed button the drive team has requested that we provide a button that can reduce the robot speed by the button we will map this to is right trigger when the trigger is released the robot should go back to full speed
| 0
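The half-speed requirement above reduces to scaling drive power while the trigger is held. A sketch of that logic (the actual robot code in ftc_app would be Java; the 0.5 press threshold and 50% scale here are illustrative):

```python
def drive_scale(right_trigger, pressed_threshold=0.5):
    """Return 0.5 while the trigger is held past the threshold, else 1.0."""
    return 0.5 if right_trigger > pressed_threshold else 1.0

def scaled_power(stick_input, right_trigger):
    """Apply the half-speed scale to a joystick drive input."""
    return stick_input * drive_scale(right_trigger)
```

Releasing the trigger makes `drive_scale` return 1.0, so the robot goes back to full speed as requested.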
|
238,474
| 19,722,886,897
|
IssuesEvent
|
2022-01-13 16:58:26
|
vectordotdev/vector
|
https://api.github.com/repos/vectordotdev/vector
|
closed
|
Upgrade `splunk_hec` integration test image
|
sink: splunk_hec domain: tests type: task
|
Currently, integration tests for `splunk_hec` sinks [use an image](https://hub.docker.com/r/timberio/splunk-hec-test) with `Splunk v7.3.2`. [Splunk has deprecated the version](https://docs.splunk.com/Documentation/Splunk/7.3.2/Data/UsetheHTTPEventCollector) as of October 22, 2021.
While there don't appear to be major differences between `v7.3.2` and `latest (8.2.3)`, it would be useful to update the image.
- One difference of note is a new format for submitting metric events.
|
1.0
|
Upgrade `splunk_hec` integration test image - Currently, integration tests for `splunk_hec` sinks [use an image](https://hub.docker.com/r/timberio/splunk-hec-test) with `Splunk v7.3.2`. [Splunk has deprecated the version](https://docs.splunk.com/Documentation/Splunk/7.3.2/Data/UsetheHTTPEventCollector) as of October 22, 2021.
While there don't appear to be major differences between `v7.3.2` and `latest (8.2.3)`, it would be useful to update the image.
- One difference of note is a new format for submitting metric events.
|
non_process
|
upgrade splunk hec integration test image currently integration tests for splunk hec sinks with splunk as of october while there don t appear to be major differences between and latest it would be useful to update the image one difference of note is a new format for submitting metric events
| 0
|
133,396
| 10,820,407,327
|
IssuesEvent
|
2019-11-08 16:19:38
|
ICIJ/datashare
|
https://api.github.com/repos/ICIJ/datashare
|
closed
|
Batchsearch results after filtering: pages 2 and more are displayed even though they are empty
|
bug front need testing
|
When there are 100 or fewer results, we should not display the per-page navigation bar.
<img width="702" alt="Screenshot 2019-10-31 at 13 40 04" src="https://user-images.githubusercontent.com/17233829/67948272-affa2680-fbe5-11e9-82cc-cddac8bcc44d.png">
To reproduce : it happens after filtering queries only.
|
1.0
|
Batchsearch results after filtering: pages 2 and more are displayed even though they are empty - When there are 100 or fewer results, we should not display the per-page navigation bar.
<img width="702" alt="Screenshot 2019-10-31 at 13 40 04" src="https://user-images.githubusercontent.com/17233829/67948272-affa2680-fbe5-11e9-82cc-cddac8bcc44d.png">
To reproduce : it happens after filtering queries only.
|
non_process
|
batchsearch results after filtering pages and more are displayed even though they are empty when there are or less results we should not display the per page navigation bar img width alt screenshot at src to reproduce it happens after filtering queries only
| 0
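The pagination rule in the record above — hide the navigation bar when the results fit on one page — is a one-line predicate; the page size of 100 is taken from the report:

```python
def show_pagination(total_results, page_size=100):
    """Show the per-page navigation bar only when results span multiple pages."""
    return total_results > page_size
```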
|
260,730
| 27,784,706,508
|
IssuesEvent
|
2023-03-17 01:30:24
|
n-devs/NodeJSControUI
|
https://api.github.com/repos/n-devs/NodeJSControUI
|
closed
|
CVE-2019-1010266 (Medium) detected in lodash-4.17.5.tgz - autoclosed
|
Mend: dependency security vulnerability
|
## CVE-2019-1010266 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>lodash-4.17.5.tgz</b></p></summary>
<p>Lodash modular utilities.</p>
<p>Library home page: <a href="https://registry.npmjs.org/lodash/-/lodash-4.17.5.tgz">https://registry.npmjs.org/lodash/-/lodash-4.17.5.tgz</a></p>
<p>Path to dependency file: /NodeJSControUI/package.json</p>
<p>Path to vulnerable library: /node_modules/lodash/package.json</p>
<p>
Dependency Hierarchy:
- jsdom-11.6.2.tgz (Root Library)
- request-promise-native-1.0.5.tgz
- request-promise-core-1.1.1.tgz
- :x: **lodash-4.17.5.tgz** (Vulnerable Library)
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
lodash prior to 4.17.11 is affected by: CWE-400: Uncontrolled Resource Consumption. The impact is: Denial of service. The component is: Date handler. The attack vector is: Attacker provides very long strings, which the library attempts to match using a regular expression. The fixed version is: 4.17.11.
<p>Publish Date: 2019-07-17
<p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2019-1010266>CVE-2019-1010266</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>6.5</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: Low
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2019-1010266">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2019-1010266</a></p>
<p>Release Date: 2019-07-17</p>
<p>Fix Resolution (lodash): 4.17.11</p>
<p>Direct dependency fix Resolution (jsdom): 11.7.0</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with Mend [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
|
True
|
CVE-2019-1010266 (Medium) detected in lodash-4.17.5.tgz - autoclosed - ## CVE-2019-1010266 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>lodash-4.17.5.tgz</b></p></summary>
<p>Lodash modular utilities.</p>
<p>Library home page: <a href="https://registry.npmjs.org/lodash/-/lodash-4.17.5.tgz">https://registry.npmjs.org/lodash/-/lodash-4.17.5.tgz</a></p>
<p>Path to dependency file: /NodeJSControUI/package.json</p>
<p>Path to vulnerable library: /node_modules/lodash/package.json</p>
<p>
Dependency Hierarchy:
- jsdom-11.6.2.tgz (Root Library)
- request-promise-native-1.0.5.tgz
- request-promise-core-1.1.1.tgz
- :x: **lodash-4.17.5.tgz** (Vulnerable Library)
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
lodash prior to 4.17.11 is affected by: CWE-400: Uncontrolled Resource Consumption. The impact is: Denial of service. The component is: Date handler. The attack vector is: Attacker provides very long strings, which the library attempts to match using a regular expression. The fixed version is: 4.17.11.
<p>Publish Date: 2019-07-17
<p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2019-1010266>CVE-2019-1010266</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>6.5</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: Low
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2019-1010266">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2019-1010266</a></p>
<p>Release Date: 2019-07-17</p>
<p>Fix Resolution (lodash): 4.17.11</p>
<p>Direct dependency fix Resolution (jsdom): 11.7.0</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with Mend [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
|
non_process
|
cve medium detected in lodash tgz autoclosed cve medium severity vulnerability vulnerable library lodash tgz lodash modular utilities library home page a href path to dependency file nodejscontroui package json path to vulnerable library node modules lodash package json dependency hierarchy jsdom tgz root library request promise native tgz request promise core tgz x lodash tgz vulnerable library vulnerability details lodash prior to is affected by cwe uncontrolled resource consumption the impact is denial of service the component is date handler the attack vector is attacker provides very long strings which the library attempts to match using a regular expression the fixed version is publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required low user interaction none scope unchanged impact metrics confidentiality impact none integrity impact none availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution lodash direct dependency fix resolution jsdom step up your open source security game with mend
| 0
|
16,132
| 20,381,942,383
|
IssuesEvent
|
2022-02-21 23:27:54
|
fluent/fluent-bit
|
https://api.github.com/repos/fluent/fluent-bit
|
closed
|
Add logo to list of companies using fluent-bit
|
work-in-process
|
We use fluent-bit in our company to enable data collection for a variety of platforms. We would like our company logo listed. As suggested in the README, we are raising a request for the same. Thanks in advance and kudos on such a useful tool that should be in every enterprise arsenal!

|
1.0
|
Add logo to list of companies using fluent-bit - We use fluent-bit in our company to enable data collection for a variety of platforms. We would like our company logo listed. As suggested in the README, we are raising a request for the same. Thanks in advance and kudos on such a useful tool that should be in every enterprise arsenal!

|
process
|
add logo to list of companies using fluent bit we use fluent bit in our company to enable data collection for a variety of platforms we would like our company logo listed as suggested in the readme we are raising a request for the same thanks in advance and kudos on such a useful tool that should be in every enterprise arsenal
| 1
|
112,751
| 4,537,479,314
|
IssuesEvent
|
2016-09-09 00:34:47
|
angular/angular-cli
|
https://api.github.com/repos/angular/angular-cli
|
closed
|
barrel not created in manually created folder after creating services in it
|
effort: (1) easy priority: 1 (urgent) type: bug
|
*SUGGESTION* when I run these commands, no barrel is created in the folder. I think it should be.
```
mkdir speaker-services && cd speaker-services
ng g service svc1
ng g service svc2
```
HANS - agreed. That's a bug.
cc @hansl @Brocco
|
1.0
|
barrel not created in manually created folder after creating services in it - *SUGGESTION* when I run these commands, no barrel is created in the folder. I think it should be.
```
mkdir speaker-services && cd speaker-services
ng g service svc1
ng g service svc2
```
HANS - agreed. That's a bug.
cc @hansl @Brocco
|
non_process
|
barrel not created in manually created folder after creating services in it suggestion when i run these commands no barrel is created in the folder i think it should be mkdir speaker services cd speaker services ng g service ng g service hans agreed that s a bug cc hansl brocco
| 0
|
700,294
| 24,055,016,177
|
IssuesEvent
|
2022-09-16 15:59:34
|
b-venter/ShopAppFront
|
https://api.github.com/repos/b-venter/ShopAppFront
|
closed
|
Item auto filter details
|
enhancement priority
|
When adding an item (or shop) to the shopping list, provide extra details like brand.
|
1.0
|
Item auto filter details - When adding an item (or shop) to the shopping list, provide extra details like brand.
|
non_process
|
item auto filter details when adding an item or shop to the shopping list provide extra details like brand
| 0
|
20,283
| 29,507,363,110
|
IssuesEvent
|
2023-06-03 13:30:02
|
foamzou/ITraffic-monitor-for-mac
|
https://api.github.com/repos/foamzou/ITraffic-monitor-for-mac
|
closed
|
display panel's position has offset
|
compatible
|
<img width="463" alt="image" src="https://user-images.githubusercontent.com/11816720/215300093-57d1f154-6347-40a6-a880-0e2f1ef6f2d4.png">
<img width="526" alt="image" src="https://user-images.githubusercontent.com/11816720/215300198-deb84867-14c9-4f26-b798-eb69ba79b12b.png">
|
True
|
display panel's position has offset - <img width="463" alt="image" src="https://user-images.githubusercontent.com/11816720/215300093-57d1f154-6347-40a6-a880-0e2f1ef6f2d4.png">
<img width="526" alt="image" src="https://user-images.githubusercontent.com/11816720/215300198-deb84867-14c9-4f26-b798-eb69ba79b12b.png">
|
non_process
|
display panel s position has offset img width alt image src img width alt image src
| 0
|
39,967
| 5,258,526,868
|
IssuesEvent
|
2017-02-02 23:40:30
|
axemclion/IndexedDBShim
|
https://api.github.com/repos/axemclion/IndexedDBShim
|
opened
|
Interface testing issue: indexeddb expected as a getter
|
Bug-missing API Testing
|
Just reporting a minor issue encountered in testing.
Two of the W3C IndexedDB tests relate to interfaces (in the main thread or workers).
One of the expectations in the main thread is apparently that for this:
```webidl
partial interface WindowOrWorkerGlobalScope {
[SameObject] readonly attribute IDBFactory indexedDB;
};
```
...the `indexedDB` property is expected to be implemented as a getter (though for some reason, the testing does not insist that this be the case for the WorkerGlobalScope).
However, there is a bug in [Node](https://github.com/nodejs/node/issues/2734)/[jsdom](https://github.com/tmpvar/jsdom/issues/1720) which converts accessor property descriptors (as with a getter) into data descriptors when within a vm (where we execute the originally-HTML W3C tests to run in a sandbox and thus somewhat more safely).
And thus, we get a complaint from the main thread interface test that `"indexedDB" must have a getter expected "function"` (and switching to `value`/`writable` with its lack of a getter gets the same complaint).
A small issue of presumably little consequence except for tests but one which will hopefully be fixed (especially since we are otherwise already passing all of the other many interface tests).
|
1.0
|
Interface testing issue: indexeddb expected as a getter - Just reporting a minor issue encountered in testing.
Two of the W3C IndexedDB tests relate to interfaces (in the main thread or workers).
One of the expectations in the main thread is apparently that for this:
```webidl
partial interface WindowOrWorkerGlobalScope {
[SameObject] readonly attribute IDBFactory indexedDB;
};
```
...the `indexedDB` property is expected to be implemented as a getter (though for some reason, the testing does not insist that this be the case for the WorkerGlobalScope).
However, there is a bug in [Node](https://github.com/nodejs/node/issues/2734)/[jsdom](https://github.com/tmpvar/jsdom/issues/1720) which converts accessor property descriptors (as with a getter) into data descriptors when within a vm (where we execute the originally-HTML W3C tests to run in a sandbox and thus somewhat more safely).
And thus, we get a complaint from the main thread interface test that `"indexedDB" must have a getter expected "function"` (and switching to `value`/`writable` with its lack of a getter gets the same complaint).
A small issue of presumably little consequence except for tests but one which will hopefully be fixed (especially since we are otherwise already passing all of the other many interface tests).
|
non_process
|
interface testing issue indexeddb expected as a getter just reporting a minor issue encountered in testing two of the indexeddb tests relate to interfaces in the main thread or workers one of the expectations in the main thread is apparently that for this webidl partial interface windoworworkerglobalscope readonly attribute idbfactory indexeddb the indexeddb property is expected to be implemented as a getter though for some reason the testing does not insist that this be the case for the workerglobalscope however there is a bug in which converts accessor property descriptors as with a getter into data descriptors when within a vm where we execute the originally html tests to run in a sandbox and thus somewhat more safely and thus we get a complaint from the main thread interface test that indexeddb must have a getter expected function and switching to value writable with its lack of a getter gets the same complaint a small issue of presumably little consequence except for tests but one which will hopefully be fixed especially since we are otherwise already passing all of the other many interface tests
| 0
|
257,163
| 19,489,436,196
|
IssuesEvent
|
2021-12-27 01:47:24
|
VeryBuy/react-native-tappay
|
https://api.github.com/repos/VeryBuy/react-native-tappay
|
closed
|
update README
|
documentation
|
- [x] badges
- [x] description
- [x] Support payment type
- [x] tappay sdk version
- [x] How to run Example
- [x] License
|
1.0
|
update README - - [x] badges
- [x] description
- [x] Support payment type
- [x] tappay sdk version
- [x] How to run Example
- [x] License
|
non_process
|
update readme badges description support payment type tappay sdk version how to run example license
| 0
|
17,878
| 23,830,227,673
|
IssuesEvent
|
2022-09-05 19:38:49
|
GoogleCloudPlatform/python-docs-samples
|
https://api.github.com/repos/GoogleCloudPlatform/python-docs-samples
|
closed
|
Found a possible security concern
|
type: process samples
|
Hello Team!
Previously I detailed a security bug in your code on 22nd August. Google bug hunter team acknowledged the case, I thought that might be the right place for reporting code-level security issues. So, my new findings also detailed recently on [g.co/vulnz](https://g.co/vulnz). (Regarding your policy). But after following the policy I get baffled by their reply.
[](https://postimg.cc/G9YxSkQj)
I figure the new report was dealt with by someone not mindful of the policy and my past report was handled fine (acknowledged). In case I'm right, please make sure the GoogleVRP will accept code issues of this repository.
Regards,
|
1.0
|
Found a possible security concern - Hello Team!
Previously I detailed a security bug in your code on 22nd August. Google bug hunter team acknowledged the case, I thought that might be the right place for reporting code-level security issues. So, my new findings also detailed recently on [g.co/vulnz](https://g.co/vulnz). (Regarding your policy). But after following the policy I get baffled by their reply.
[](https://postimg.cc/G9YxSkQj)
I figure the new report was dealt with by someone not mindful of the policy and my past report was handled fine (acknowledged). In case I'm right, please make sure the GoogleVRP will accept code issues of this repository.
Regards,
|
process
|
found a possible security concern hello team previously i detailed a security bug in your code on august google bug hunter team acknowledged the case i thought that might be the right place for reporting code level security issues so my new findings also detailed recently on regarding your policy but after following the policy i get baffled by their reply i figure the new report was dealt with by someone not mindful of the policy and my past report was handled fine acknowledged in case i m right please make sure the googlevrp will accept code issues of this repository regards
| 1
|
73,053
| 31,844,262,023
|
IssuesEvent
|
2023-09-14 18:38:02
|
AmplicaLabs/content-publishing-service
|
https://api.github.com/repos/AmplicaLabs/content-publishing-service
|
opened
|
Copy of Setup end user docker and publish
|
content-publishing-service
|
Docker compose consist of redis, kubo, 1 api, 1 worker, chain
|
1.0
|
Copy of Setup end user docker and publish - Docker compose consist of redis, kubo, 1 api, 1 worker, chain
|
non_process
|
copy of setup end user docker and publish docker compose consist of redis kubo api worker chain
| 0
|
20,758
| 27,491,942,004
|
IssuesEvent
|
2023-03-04 18:26:10
|
darktable-org/darktable
|
https://api.github.com/repos/darktable-org/darktable
|
closed
|
In retouch the brush source is not fully copied to target
|
reproduce: confirmed scope: UI scope: image processing
|
**Describe the bug/issue**
The brush source is not fully copied to target. The retouch module only starts to copy from where the source start is projected to the target. See the picture below.


**To Reproduce**
1. Open an image in darkroom
2. Open retouch module
3. Draw a brush line
4. Move the source so that the start of the source is clearly projected below the start of the target
**Expected behavior**
The source is copied to the target at full length
**Platform**
* darktable version : 4.1.0+236~g78747838c
* OS : Win10
* Memory : 32GB
* Graphics card : NVidia T500
* OpenCL installed : yes
* OpenCL activated : the bus is present both OpenCL activated and inactive
**Additional context**
- Can you reproduce with another darktable version(s)? **yes with version 4.0, and with 4.1 on Linux**
- Are the steps above reproducible with a fresh edit (i.e. after discarding history)? **yes/no**
- If the issue is with the output image, attach an XMP file if (you'll have to change the extension to `.txt`)
- Is the issue still present using an empty/new config-dir (e.g. start darktable with --configdir "/tmp")? **yes**
[P6221669.zip](https://github.com/darktable-org/darktable/files/9449947/P6221669.zip)
|
1.0
|
In retouch the brush source is not fully copied to target - **Describe the bug/issue**
The brush source is not fully copied to target. The retouch module only starts to copy from where the source start is projected to the target. See the picture below.


**To Reproduce**
1. Open an image in darkroom
2. Open retouch module
3. Draw a brush line
4. Move the source so that the start of the source is clearly projected below the start of the target
**Expected behavior**
The source is copied to the target at full length
**Platform**
* darktable version : 4.1.0+236~g78747838c
* OS : Win10
* Memory : 32GB
* Graphics card : NVidia T500
* OpenCL installed : yes
* OpenCL activated : the bus is present both OpenCL activated and inactive
**Additional context**
- Can you reproduce with another darktable version(s)? **yes with version 4.0, and with 4.1 on Linux**
- Are the steps above reproducible with a fresh edit (i.e. after discarding history)? **yes/no**
- If the issue is with the output image, attach an XMP file if (you'll have to change the extension to `.txt`)
- Is the issue still present using an empty/new config-dir (e.g. start darktable with --configdir "/tmp")? **yes**
[P6221669.zip](https://github.com/darktable-org/darktable/files/9449947/P6221669.zip)
|
process
|
in retouch the brush source is not fully copied to target describe the bug issue the brush source is not fully copied to target the retouch module only starts to copy from where the source start is projected to the target see the picture below to reproduce open an image in darkroom open retouch module draw a brush line move the source so that the start of the source is clearly projected below the start of the target expected behavior the source is copied to the target at full length platform darktable version os memory graphics card nvidia opencl installed yes opencl activated the bus is present both opencl activated and inactive additional context can you reproduce with another darktable version s yes with version and with on linux are the steps above reproducible with a fresh edit i e after discarding history yes no if the issue is with the output image attach an xmp file if you ll have to change the extension to txt is the issue still present using an empty new config dir e g start darktable with configdir tmp yes
| 1
|
204,988
| 7,093,372,340
|
IssuesEvent
|
2018-01-12 20:13:10
|
CCAFS/MARLO
|
https://api.github.com/repos/CCAFS/MARLO
|
closed
|
Review compare history function
|
Priority - High Type -Task
|
- [x] IP: Outcomes
- [x] IP: Cluster Of Activities
- [x] Project: Description
- [x] Project: Partners
- [x] Project: Locations
- [x] Project: Contribution to CRP Outcomes
- [x] Project: Deliverable
- [x] Project: Activities
- [x] Project: Budget By Partners
- [x] Project: Budget by CoA
- [x] Funding Source
|
1.0
|
Review compare history function - - [x] IP: Outcomes
- [x] IP: Cluster Of Activities
- [x] Project: Description
- [x] Project: Partners
- [x] Project: Locations
- [x] Project: Contribution to CRP Outcomes
- [x] Project: Deliverable
- [x] Project: Activities
- [x] Project: Budget By Partners
- [x] Project: Budget by CoA
- [x] Funding Source
|
non_process
|
review compare history function ip outcomes ip cluster of activities project description project partners project locations project contribution to crp outcomes project deliverable project activities project budget by partners project budget by coa funding source
| 0
|
381,173
| 11,274,462,012
|
IssuesEvent
|
2020-01-14 18:37:11
|
rapsaGnauJ/Mega-Man-Space-2
|
https://api.github.com/repos/rapsaGnauJ/Mega-Man-Space-2
|
closed
|
[FEATURE REQUEST] Move fire() and shoot_projectile() methods to character.gd
|
feature high priority request
|
**Is your feature request related to a problem? Please describe.**
Enemies should be able to shoot as well as the player.
**Describe the solution you'd like**
Implement a fire() and shoot_projectile() in character.gd.
**Describe alternatives you've considered**
**Additional context**
|
1.0
|
[FEATURE REQUEST] Move fire() and shoot_projectile() methods to character.gd - **Is your feature request related to a problem? Please describe.**
Enemies should be able to shoot as well as the player.
**Describe the solution you'd like**
Implement a fire() and shoot_projectile() in character.gd.
**Describe alternatives you've considered**
**Additional context**
|
non_process
|
move fire and shoot projectile methods to character gd is your feature request related to a problem please describe enemies should be able to shoot as well as the player describe the solution you d like implement a fire and shoot projectile in character gd describe alternatives you ve considered additional context
| 0
|
17,483
| 23,300,441,729
|
IssuesEvent
|
2022-08-07 08:32:41
|
Arch666Angel/mods
|
https://api.github.com/repos/Arch666Angel/mods
|
closed
|
Angel's BIO special fishes block Landfill placement when Transport Drones is used
|
Impact: Bug Angels Bio Processing
|
Hello,
Sorry if this is already known/reported, here or anywhere else, I searched here and at the forums and didn't see mention, and that's why I posted this here.
I know this is probably a known issue with a change Wube made in the game some months ago, and it used to happen the same with vanilla Factorio's fishes, but they fixed it at the base game's level, I think in release 1.1.57, IIRC.
Problem is, it seems your modded fished still have this issue, and they block landfill from being applied over the area where the fishes are, generating lots of gameplay disruption when you need to landfill over a big area... since you don't have any choice but to grab the fish (decon them or picking them) or wait patiently until they move under an already filled tile freeing the ones that were blocked.
Problem is, for both of those workaround, you need to be near the the area to manually intervene or to activate the chunks, since the fishes don't move when there's no player (or spidertron) nearby, which kind of defeats the point of automating stuff (IMO, at least)
** Reproducibility **
To reproduce the issue, it's as simple as installing Transport Drones (could be an issue with other mods too) and Angel's Bio in a game, and try to apply landfill over an area that contains fishes... the landfills placement is shown as red and you can't apply it even with holding shift, it applies in the tiles surrounding the fish, but not where the fish is.
As for game versions, currently on last stable on Steam (1.1.61)
But the issue it's happening since they made the change on the base game, back in .55/.56 (I don't recall exactly) and they later released a patch where they fixed it for vanilla (in 1.1.57, I think?)
To verify this, I started a new game with only Angel's BIO (+dependencies ofc.) and Transport Drones loaded, and the issue happens, then I repeated the same but this time didn't activate Transport Drones, and the issue didn't happen.
2. Modlist
Modlist, as reported by the game in the current log, uploaded here -> https://pastebin.com/9zXqti80
**Screenshots**
Here's the issue at work in two screenshots in creative mode in-game (not the scenario editor, just with /editor mode active):
When attempting to place Landfill:

After the placement is done:

Best regards, and if more info is needed, I'll be glad to post anything missing.
EDIT: typos
|
1.0
|
Angel's BIO special fishes block Landfill placement when Transport Drones is used - Hello,
Sorry if this is already known/reported, here or anywhere else, I searched here and at the forums and didn't see mention, and that's why I posted this here.
I know this is probably a known issue with a change Wube made in the game some months ago, and it used to happen the same with vanilla Factorio's fishes, but they fixed it at the base game's level, I think in release 1.1.57, IIRC.
Problem is, it seems your modded fished still have this issue, and they block landfill from being applied over the area where the fishes are, generating lots of gameplay disruption when you need to landfill over a big area... since you don't have any choice but to grab the fish (decon them or picking them) or wait patiently until they move under an already filled tile freeing the ones that were blocked.
Problem is, for both of those workaround, you need to be near the the area to manually intervene or to activate the chunks, since the fishes don't move when there's no player (or spidertron) nearby, which kind of defeats the point of automating stuff (IMO, at least)
** Reproducibility **
To reproduce the issue, it's as simple as installing Transport Drones (could be an issue with other mods too) and Angel's Bio in a game, and try to apply landfill over an area that contains fishes... the landfills placement is shown as red and you can't apply it even with holding shift, it applies in the tiles surrounding the fish, but not where the fish is.
As for game versions, currently on last stable on Steam (1.1.61)
But the issue it's happening since they made the change on the base game, back in .55/.56 (I don't recall exactly) and they later released a patch where they fixed it for vanilla (in 1.1.57, I think?)
To verify this, I started a new game with only Angel's BIO (+dependencies ofc.) and Transport Drones loaded, and the issue happens, then I repeated the same but this time didn't activate Transport Drones, and the issue didn't happen.
2. Modlist
Modlist, as reported by the game in the current log, uploaded here -> https://pastebin.com/9zXqti80
**Screenshots**
Here's the issue at work in two screenshots in creative mode in-game (not the scenario editor, just with /editor mode active):
When attempting to place Landfill:

After the placement is done:

Best regards, and if more info is needed, I'll be glad to post anything missing.
EDIT: typos
|
process
|
angel s bio special fishes block landfill placement when transport drones is used hello sorry if this is already known reported here or anywhere else i searched here and at the forums and didn t see mention and that s why i posted this here i know this is probably a known issue with a change wube made in the game some months ago and it used to happen the same with vanilla factorio s fishes but they fixed it at the base game s level i think in release iirc problem is it seems your modded fished still have this issue and they block landfill from being applied over the area where the fishes are generating lots of gameplay disruption when you need to landfill over a big area since you don t have any choice but to grab the fish decon them or picking them or wait patiently until they move under an already filled tile freeing the ones that were blocked problem is for both of those workaround you need to be near the the area to manually intervene or to activate the chunks since the fishes don t move when there s no player or spidertron nearby which kind of defeats the point of automating stuff imo at least reproducibility to reproduce the issue it s as simple as installing transport drones could be an issue with other mods too and angel s bio in a game and try to apply landfill over an area that contains fishes the landfills placement is shown as red and you can t apply it even with holding shift it applies in the tiles surrounding the fish but not where the fish is as for game versions currently on last stable on steam but the issue it s happening since they made the change on the base game back in i don t recall exactly and they later released a patch where they fixed it for vanilla in i think to verify this i started a new game with only angel s bio dependencies ofc and transport drones loaded and the issue happens then i repeated the same but this time didn t activate transport drones and the issue didn t happen modlist modlist as reported by the game in the current log uploaded here screenshots here s the issue at work in two screenshots in creative mode in game not the scenario editor just with editor mode active when attempting to place landfill after the placement is done best regards and if more info is needed i ll be glad to post anything missing edit typos
| 1
|
19,594
| 25,945,773,126
|
IssuesEvent
|
2022-12-17 00:38:10
|
NationalSecurityAgency/ghidra
|
https://api.github.com/repos/NationalSecurityAgency/ghidra
|
closed
|
Register confusion and misinterpretation leading to incorrect decompilation
|
Type: Bug Feature: Processor/x86
|
**Describe the bug**
When decompiling an x86-16 bit protected mode Windows NE application that also uses 32-bit registers (ie `EAX`, etc), the decompiler appears to be treating them as extended versions of `SI` register. Earlier in this function `SI` is loaded from one structure member (1st and last lines below) and it can also be seen that `local_6` is set from a multiplication of 2 other members:
```
11c8:09e1 c4 5e 06 00a LES BX,[BP + pStateMgr]
11c8:09e4 66 26 0f 00a MOVZX EAX,word ptr ES:[BX]
b7 07
11c8:09e9 66 26 8b 00a MOV EDX,dword ptr ES:[BX + 0x10]
57 10
11c8:09ee 66 0f af d0 00a IMUL EDX,EAX
11c8:09f2 66 89 56 fc 00a MOV dword ptr [BP + local_6],EDX
11c8:09f6 26 8b 77 02 00a MOV SI,word ptr ES:[BX + 0x2]
```
Later the assembly is clearly assigning a value to `EAX` from `SI` and comparing that with a stack variable `local_6`:
```
11c8:09fe 66 0f b7 c6 00a MOVZX EAX,SI
EAX = INT_ZEXT SI
11c8:0a02 66 3b 46 fc 00a CMP EAX,dword ptr [BP + local_6]
$U1100:2 = INT_ADD BP, 0xfffc:2
$U3400:4 = CALLOTHER "segment", SS, $U1100:2
$U3c80:4 = LOAD ram($U3400:4)
CF = INT_LESS EAX, $U3c80:4
$U3c80:4 = LOAD ram($U3400:4)
OF = INT_SBORROW EAX, $U3c80:4
$U3c80:4 = LOAD ram($U3400:4)
$U1ee00:4 = INT_SUB EAX, $U3c80:4
SF = INT_SLESS $U1ee00:4, 0:4
ZF = INT_EQUAL $U1ee00:4, 0:4
$Udc80:4 = INT_AND $U1ee00:4, 0xff:4
$Udd00:1 = POPCOUNT $Udc80:4
$Udd80:1 = INT_AND $Udd00:1, 1:1
PF = INT_EQUAL $Udd80:1, 0:1
11c8:0a06 76 03 00a JBE LAB_11c8_0a0b
$U8080:1 = BOOL_OR CF, ZF
CBRANCH *[ram]0x11c80a0b:4, $U8080:1
11c8:0a08 8b 76 fc 00a MOV SI,word ptr [BP + local_6]
$U1100:2 = INT_ADD BP, 0xfffc:2
$U3400:4 = CALLOTHER "segment", SS, $U1100:2
$U7900:2 = LOAD ram($U3400:4)
SI = COPY $U7900:2
LAB_11c8_0a0b XREF[1]: 11c8:0a06(j)
11c8:0a0b 56 00a PUSH SI
$U9400:2 = COPY SI
SP = INT_SUB SP, 2:2
$U9580:4 = CALLOTHER "segment", SS, SP
STORE ram($U9580:4), $U9400:2
```
However, the C produced is:
```
if (local_6 < (uint)local_6) {
}
```
indicating that it has somehow treated `EAX` as a copy of `local_6` instead of a 32-bit version of the `SI` register. It subsequenty deems the assignment at `11c8:0a08` (the interior of the above `if ...` statement) to be superfluous.
The initial assignment to `SI` does use the following C decompilation:
`local_6._0_2_ = pStateMgr->m_nBytes_0x2;`
which seems to be corroberating the logic for removing the update in the above `if ...` statement! It should be a different variable.
**To Reproduce**
Steps to reproduce the behavior:
1. Load the provided file (created from the "Debug Function Decompilation" menu
2. See the issue
**Expected behavior**
To be able to correctly identify the `SI` (and `DI`) register(s) as independent from those extended registers and treat them as separate variables as to produce correct decompilaton. (NB: `DI` and `SI` registers are often used together with `DS` and `ES` (sometime `SS`) segment registers as 32-bit addresses.)
**Screenshots**

**Attachments**
[dragon_WriteStateArray.zip](https://github.com/NationalSecurityAgency/ghidra/files/9358717/dragon_WriteStateArray.zip)
**Environment (please complete the following information):**
- OS: Windows 11
- Java Version: 17.0.3.1
- Ghidra Version: [e.g. 10.2-DEV
- Ghidra Origin: locally built commit 0590f9336ee61f8c60a7a7f5423646eb5cb00861
-
**Additional context**
(Possibly related to https://github.com/NationalSecurityAgency/ghidra/issues/4529.)
|
1.0
|
Register confusion and misinterpretation leading to incorrect decompilation - **Describe the bug**
When decompiling an x86-16 bit protected mode Windows NE application that also uses 32-bit registers (ie `EAX`, etc), the decompiler appears to be treating them as extended versions of `SI` register. Earlier in this function `SI` is loaded from one structure member (1st and last lines below) and it can also be seen that `local_6` is set from a multiplication of 2 other members:
```
11c8:09e1 c4 5e 06 00a LES BX,[BP + pStateMgr]
11c8:09e4 66 26 0f 00a MOVZX EAX,word ptr ES:[BX]
b7 07
11c8:09e9 66 26 8b 00a MOV EDX,dword ptr ES:[BX + 0x10]
57 10
11c8:09ee 66 0f af d0 00a IMUL EDX,EAX
11c8:09f2 66 89 56 fc 00a MOV dword ptr [BP + local_6],EDX
11c8:09f6 26 8b 77 02 00a MOV SI,word ptr ES:[BX + 0x2]
```
Later the assembly is clearly assigning a value to `EAX` from `SI` and comparing that with a stack variable `local_6`:
```
11c8:09fe 66 0f b7 c6 00a MOVZX EAX,SI
EAX = INT_ZEXT SI
11c8:0a02 66 3b 46 fc 00a CMP EAX,dword ptr [BP + local_6]
$U1100:2 = INT_ADD BP, 0xfffc:2
$U3400:4 = CALLOTHER "segment", SS, $U1100:2
$U3c80:4 = LOAD ram($U3400:4)
CF = INT_LESS EAX, $U3c80:4
$U3c80:4 = LOAD ram($U3400:4)
OF = INT_SBORROW EAX, $U3c80:4
$U3c80:4 = LOAD ram($U3400:4)
$U1ee00:4 = INT_SUB EAX, $U3c80:4
SF = INT_SLESS $U1ee00:4, 0:4
ZF = INT_EQUAL $U1ee00:4, 0:4
$Udc80:4 = INT_AND $U1ee00:4, 0xff:4
$Udd00:1 = POPCOUNT $Udc80:4
$Udd80:1 = INT_AND $Udd00:1, 1:1
PF = INT_EQUAL $Udd80:1, 0:1
11c8:0a06 76 03 00a JBE LAB_11c8_0a0b
$U8080:1 = BOOL_OR CF, ZF
CBRANCH *[ram]0x11c80a0b:4, $U8080:1
11c8:0a08 8b 76 fc 00a MOV SI,word ptr [BP + local_6]
$U1100:2 = INT_ADD BP, 0xfffc:2
$U3400:4 = CALLOTHER "segment", SS, $U1100:2
$U7900:2 = LOAD ram($U3400:4)
SI = COPY $U7900:2
LAB_11c8_0a0b XREF[1]: 11c8:0a06(j)
11c8:0a0b 56 00a PUSH SI
$U9400:2 = COPY SI
SP = INT_SUB SP, 2:2
$U9580:4 = CALLOTHER "segment", SS, SP
STORE ram($U9580:4), $U9400:2
```
However, the C produced is:
```
if (local_6 < (uint)local_6) {
}
```
indicating that it has somehow treated `EAX` as a copy of `local_6` instead of a 32-bit version of the `SI` register. It subsequenty deems the assignment at `11c8:0a08` (the interior of the above `if ...` statement) to be superfluous.
The initial assignment to `SI` does use the following C decompilation:
`local_6._0_2_ = pStateMgr->m_nBytes_0x2;`
which seems to be corroberating the logic for removing the update in the above `if ...` statement! It should be a different variable.
**To Reproduce**
Steps to reproduce the behavior:
1. Load the provided file (created from the "Debug Function Decompilation" menu
2. See the issue
**Expected behavior**
To be able to correctly identify the `SI` (and `DI`) register(s) as independent from those extended registers and treat them as separate variables as to produce correct decompilaton. (NB: `DI` and `SI` registers are often used together with `DS` and `ES` (sometime `SS`) segment registers as 32-bit addresses.)
**Screenshots**

**Attachments**
[dragon_WriteStateArray.zip](https://github.com/NationalSecurityAgency/ghidra/files/9358717/dragon_WriteStateArray.zip)
**Environment (please complete the following information):**
- OS: Windows 11
- Java Version: 17.0.3.1
- Ghidra Version: [e.g. 10.2-DEV
- Ghidra Origin: locally built commit 0590f9336ee61f8c60a7a7f5423646eb5cb00861
-
**Additional context**
(Possibly related to https://github.com/NationalSecurityAgency/ghidra/issues/4529.)
|
process
|
register confusion and misinterpretation leading to incorrect decompilation describe the bug when decompiling an bit protected mode windows ne application that also uses bit registers ie eax etc the decompiler appears to be treating them as extended versions of si register earlier in this function si is loaded from one structure member and last lines below and it can also be seen that local is set from a multiplication of other members les bx movzx eax word ptr es mov edx dword ptr es af imul edx eax fc mov dword ptr edx mov si word ptr es later the assembly is clearly assigning a value to eax from si and comparing that with a stack variable local movzx eax si eax int zext si fc cmp eax dword ptr int add bp callother segment ss load ram cf int less eax load ram of int sborrow eax load ram int sub eax sf int sless zf int equal int and popcount int and pf int equal jbe lab bool or cf zf cbranch fc mov si word ptr int add bp callother segment ss load ram si copy lab xref j push si copy si sp int sub sp callother segment ss sp store ram however the c produced is if local uint local indicating that it has somehow treated eax as a copy of local instead of a bit version of the si register it subsequenty deems the assignment at the interior of the above if statement to be superfluous the initial assignment to si does use the following c decompilation local pstatemgr m nbytes which seems to be corroberating the logic for removing the update in the above if statement it should be a different variable to reproduce steps to reproduce the behavior load the provided file created from the debug function decompilation menu see the issue expected behavior to be able to correctly identify the si and di register s as independent from those extended registers and treat them as separate variables as to produce correct decompilaton nb di and si registers are often used together with ds and es sometime ss segment registers as bit addresses screenshots attachments environment please complete the following information os windows java version ghidra version e g dev ghidra origin locally built commit additional context possibly related to
| 1
|
7,336
| 2,847,620,656
|
IssuesEvent
|
2015-05-29 18:01:31
|
mozilla/teach.webmaker.org
|
https://api.github.com/repos/mozilla/teach.webmaker.org
|
closed
|
Main image on Events page needs to be re-cropped
|
design
|
Firefox on Android
The main image should be centered better if possible - seems like the action is all off frame
https://www.dropbox.com/s/247cu7rurupmho5/Screenshot_2015-04-01-16-09-16.png
|
1.0
|
Main image on Events page needs to be re-cropped - Firefox on Android
The main image should be centered better if possible - seems like the action is all off frame
https://www.dropbox.com/s/247cu7rurupmho5/Screenshot_2015-04-01-16-09-16.png
|
non_process
|
main image on events page needs to be re cropped firefox on android the main image should be centered better if possible seems like the action is all off frame
| 0
|
419,367
| 12,222,594,902
|
IssuesEvent
|
2020-05-02 13:56:39
|
iza-institute-of-labor-economics/gettsim
|
https://api.github.com/repos/iza-institute-of-labor-economics/gettsim
|
opened
|
Werbungskostenabzug fpr ALG2 needs to be corrected
|
bug priority-low tax-transfer-feature
|
### Bug description
When calculating the relevant income for ALG2, there used to be fixed amount of 15,33€ to be deducted for people with income from employment (§6 SGB II). This ruling changed on 2016-08-01 to:
> (1) Als Pauschbeträge sind abzusetzen
[...]
> 3. von dem Einkommen Leistungsberechtigter monatlich ein Betrag in Höhe eines Zwölftels der zum Zeitpunkt der Entscheidung über den Leistungsanspruch nachgewiesenen Jahresbeiträge zu den gesetzlich vorgeschriebenen Versicherungen nach § 11b Absatz 1 Satz 1 Nummer 3 des Zweiten Buches Sozialgesetzbuch,
This needs to be corrected.
|
1.0
|
Werbungskostenabzug fpr ALG2 needs to be corrected - ### Bug description
When calculating the relevant income for ALG2, there used to be fixed amount of 15,33€ to be deducted for people with income from employment (§6 SGB II). This ruling changed on 2016-08-01 to:
> (1) Als Pauschbeträge sind abzusetzen
[...]
> 3. von dem Einkommen Leistungsberechtigter monatlich ein Betrag in Höhe eines Zwölftels der zum Zeitpunkt der Entscheidung über den Leistungsanspruch nachgewiesenen Jahresbeiträge zu den gesetzlich vorgeschriebenen Versicherungen nach § 11b Absatz 1 Satz 1 Nummer 3 des Zweiten Buches Sozialgesetzbuch,
This needs to be corrected.
|
non_process
|
werbungskostenabzug fpr needs to be corrected bug description when calculating the relevant income for there used to be fixed amount of € to be deducted for people with income from employment § sgb ii this ruling changed on to als pauschbeträge sind abzusetzen von dem einkommen leistungsberechtigter monatlich ein betrag in höhe eines zwölftels der zum zeitpunkt der entscheidung über den leistungsanspruch nachgewiesenen jahresbeiträge zu den gesetzlich vorgeschriebenen versicherungen nach § absatz satz nummer des zweiten buches sozialgesetzbuch this needs to be corrected
| 0
|
13,294
| 15,768,830,073
|
IssuesEvent
|
2021-03-31 17:39:26
|
Maps4HTML/HTML-Map-Element-UseCases-Requirements
|
https://api.github.com/repos/Maps4HTML/HTML-Map-Element-UseCases-Requirements
|
closed
|
Migrate from Travis CI to Github Actions
|
meta/process
|
I think I've addressed this with a Github Action! It would be good to merge some of the small lingering PRs to make sure this is functioning as desired before we remove Travis and close this one out. :)
|
1.0
|
Migrate from Travis CI to Github Actions - I think I've addressed this with a Github Action! It would be good to merge some of the small lingering PRs to make sure this is functioning as desired before we remove Travis and close this one out. :)
|
process
|
migrate from travis ci to github actions i think i ve addressed this with a github action it would be good to merge some of the small lingering prs to make sure this is functioning as desired before we remove travis and close this one out
| 1
|
538,049
| 15,761,530,261
|
IssuesEvent
|
2021-03-31 10:04:49
|
yalla-coop/chiltern-music-therapy
|
https://api.github.com/repos/yalla-coop/chiltern-music-therapy
|
closed
|
Go back button
|
backlog client-reviewed components front-end priority-2
|
__Wireframe link__
https://www.figma.com/file/CcYmhfnXreAPxlfyEmGsAH/Chiltern-Music-Therapy?node-id=1451%3A23016
---
### Acceptance Criteria:
_REMEMBER THAT WHOEVER WORKS ON THIS ISSUE MUST TICK OFF ALL THE POINTS IN THIS LIST UNLESS THERE IS CLEAR AGREEMENT IN THE COMMENTS TO SAY OTHERWISE. **DO NOT REVIEW A PR INVOLVING THIS ISSUE UNLESS THIS HAS BEEN DONE**_
- [ ] Always go back one in browser history if clicked
- [ ] Include common set margin component > #32
- [ ] Add to storybook
|
1.0
|
Go back button - __Wireframe link__
https://www.figma.com/file/CcYmhfnXreAPxlfyEmGsAH/Chiltern-Music-Therapy?node-id=1451%3A23016
---
### Acceptance Criteria:
_REMEMBER THAT WHOEVER WORKS ON THIS ISSUE MUST TICK OFF ALL THE POINTS IN THIS LIST UNLESS THERE IS CLEAR AGREEMENT IN THE COMMENTS TO SAY OTHERWISE. **DO NOT REVIEW A PR INVOLVING THIS ISSUE UNLESS THIS HAS BEEN DONE**_
- [ ] Always go back one in browser history if clicked
- [ ] Include common set margin component > #32
- [ ] Add to storybook
|
non_process
|
go back button wireframe link acceptance criteria remember that whoever works on this issue must tick off all the points in this list unless there is clear agreement in the comments to say otherwise do not review a pr involving this issue unless this has been done always go back one in browser history if clicked include common set margin component add to storybook
| 0
|
15,197
| 18,988,450,110
|
IssuesEvent
|
2021-11-22 02:09:24
|
dials/dials
|
https://api.github.com/repos/dials/dials
|
opened
|
processing in relative path
|
enhancement dials.import dials.stills_process
|
I noticed that `dials.import` automatically converts provided paths to absolute paths. The exact details depend on the version.
- For masks, DIALS 3.3.0 keeps relative paths, but https://github.com/dials/dxtbx/commit/1049af2839f81bf3fe05ba36b660656ab2bbdb5f made it to absolute paths in newer versions.
- For images, `dials.import template=images_######.cbf` keeps the template as is, while `dials.import image_000001.cbf` automatically guesses the template in an absolute path. This `template` trick cannot be used when importing multi-image files (e.g. EIGER, Verlox EMD).
- `dials.stills_process` imports in absolute paths.
I would prefer programs to keep the paths as given (or to have an option to do so). In some HPC environments, images and processing folders are moved from hot storage to longer term storage certain time after data collection. Or locally, you might move old data to external disks. With absolute paths, I can no longer inspect *.expt files in `dials.image_viewer`.
|
1.0
|
processing in relative path - I noticed that `dials.import` automatically converts provided paths to absolute paths. The exact details depend on the version.
- For masks, DIALS 3.3.0 keeps relative paths, but https://github.com/dials/dxtbx/commit/1049af2839f81bf3fe05ba36b660656ab2bbdb5f made it to absolute paths in newer versions.
- For images, `dials.import template=images_######.cbf` keeps the template as is, while `dials.import image_000001.cbf` automatically guesses the template in an absolute path. This `template` trick cannot be used when importing multi-image files (e.g. EIGER, Verlox EMD).
- `dials.stills_process` imports in absolute paths.
I would prefer programs to keep the paths as given (or to have an option to do so). In some HPC environments, images and processing folders are moved from hot storage to longer term storage certain time after data collection. Or locally, you might move old data to external disks. With absolute paths, I can no longer inspect *.expt files in `dials.image_viewer`.
|
process
|
processing in relative path i noticed that dials import automatically converts provided paths to absolute paths the exact details depend on the version for masks dials keeps relative paths but made it to absolute paths in newer versions for images dials import template images cbf keeps the template as is while dials import image cbf automatically guesses the template in an absolute path this template trick cannot be used when importing multi image files e g eiger verlox emd dials stills process imports in absolute paths i would prefer programs to keep the paths as given or to have an option to do so in some hpc environments images and processing folders are moved from hot storage to longer term storage certain time after data collection or locally you might move old data to external disks with absolute paths i can no longer inspect expt files in dials image viewer
| 1
|
22,514
| 31,564,461,503
|
IssuesEvent
|
2023-09-03 16:49:28
|
h4sh5/npm-auto-scanner
|
https://api.github.com/repos/h4sh5/npm-auto-scanner
|
opened
|
ioslib 1.7.34 has 1 guarddog issues
|
npm-silent-process-execution
|
```{"npm-silent-process-execution":[{"code":"\t\t\t\t\t\tvar child = spawn(handle.simulator, ['-CurrentDeviceUDID', handle.udid], { detached: true, stdio: 'ignore' });","location":"package/lib/simulator.js:1259","message":"This package is silently executing another executable"}]}```
|
1.0
|
ioslib 1.7.34 has 1 guarddog issues - ```{"npm-silent-process-execution":[{"code":"\t\t\t\t\t\tvar child = spawn(handle.simulator, ['-CurrentDeviceUDID', handle.udid], { detached: true, stdio: 'ignore' });","location":"package/lib/simulator.js:1259","message":"This package is silently executing another executable"}]}```
|
process
|
ioslib has guarddog issues npm silent process execution detached true stdio ignore location package lib simulator js message this package is silently executing another executable
| 1
|
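The guarddog finding quoted in the last record is a JSON object keyed by rule name, with a list of results per rule. A minimal sketch of flattening such a finding into one line per result — the structure mirrors the row above (the `code` field is shortened), and the parsing code is illustrative, not part of guarddog:

```python
import json

# Finding shaped like the guarddog output in the row above.
raw = json.dumps({
    "npm-silent-process-execution": [{
        "code": "var child = spawn(handle.simulator, ...)",  # shortened
        "location": "package/lib/simulator.js:1259",
        "message": "This package is silently executing another executable",
    }]
})

def flatten(findings_json: str):
    """Yield (rule, location, message) triples from a guarddog-style finding."""
    for rule, results in json.loads(findings_json).items():
        for result in results:
            yield rule, result["location"], result["message"]

for rule, location, message in flatten(raw):
    print(f"{rule} @ {location}: {message}")
```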
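Each row above carries a `label` column (`process` / `non_process`) paired with a `binary_label` (1 / 0). A minimal sketch of tallying that distribution, assuming the dataset has been exported to CSV with the column names shown in the schema (the inline two-column sample stands in for the real file):

```python
import csv
from collections import Counter
from io import StringIO

# Stand-in for the real export; real data would come from a CSV file.
sample = StringIO(
    "label,binary_label\n"
    "process,1\n"
    "non_process,0\n"
    "process,1\n"
)

counts = Counter(row["label"] for row in csv.DictReader(sample))
print(counts)  # Counter({'process': 2, 'non_process': 1})
```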