Dataset schema (column / dtype / stats):

Unnamed: 0     int64           min 0, max 832k
id             float64         min 2.49B, max 32.1B
type           stringclasses   1 value
created_at     stringlengths   min 19, max 19
repo           stringlengths   min 7, max 112
repo_url       stringlengths   min 36, max 141
action         stringclasses   3 values
title          stringlengths   min 1, max 744
labels         stringlengths   min 4, max 574
body           stringlengths   min 9, max 211k
index          stringclasses   10 values
text_combine   stringlengths   min 96, max 211k
label          stringclasses   2 values
text           stringlengths   min 96, max 188k
binary_label   int64           min 0, max 1
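The `label`/`binary_label` pairing visible in the records below ("process" encoded as 1, "non_process" as 0) can be sketched with pandas as a quick sanity check. This is an illustrative sketch, not code from the source; the two sample rows are abridged from the records in this dump:

```python
import pandas as pd

# Two sample rows abridged from the records below, keeping only
# the columns relevant to the label encoding.
df = pd.DataFrame({
    "repo": ["nv-morpheus/Morpheus", "pelias/pelias"],
    "label": ["non_process", "process"],
    "binary_label": [0, 1],
})

# binary_label appears to be the integer encoding of label:
# "process" -> 1, "non_process" -> 0.
derived = (df["label"] == "process").astype("int64")
assert derived.equals(df["binary_label"].astype("int64"))
print(derived.tolist())  # -> [0, 1]
```

The same check against the full dataset would confirm whether the encoding holds for every row, not just these two samples.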
Record (Unnamed: 0 = 695,468)
id: 23,859,335,144
type: IssuesEvent
created_at: 2022-09-07 05:02:34
repo: nv-morpheus/Morpheus
repo_url: https://api.github.com/repos/nv-morpheus/Morpheus
action: closed
title: [BUG] KafkaSourceStage: input_topic & group_id default values no longer relevant
labels: bug Priority 1
body: **Describe the bug** The current default for `group_id` in both the class and the CLI is "custreamz"; it should be "morpheus". The current default value for `input_topic` is "test_pcap"; this should be a required argument without a default value.
index: 1.0
text_combine: [BUG] KafkaSourceStage: input_topic & group_id default values no longer relevant - **Describe the bug** The current default for `group_id` in both the class and the CLI is "custreamz"; it should be "morpheus". The current default value for `input_topic` is "test_pcap"; this should be a required argument without a default value.
label: non_process
text: kafkasourcestage input topic group id default values no longer relevant describe the bug current default for group id in both the class and the cli is custreamz should be morpheus current default value for input topic is test pcap this should be a required argument without a default value
binary_label: 0
Record (Unnamed: 0 = 1,696)
id: 4,346,122,902
type: IssuesEvent
created_at: 2016-07-29 15:01:15
repo: OpenBitcoinPrivacyProject/wallet-ratings
repo_url: https://api.github.com/repos/OpenBitcoinPrivacyProject/wallet-ratings
action: opened
title: Revisit criterion for number of clicks to import a private key into an identity container
labels: criteria easy-to-process
body: OBPPV3-CR45 is: > Number of clicks to assign an imported private key into an identity container It is only assigned to one countermeasure/attack: > Use multiple identities/accounts to allow funds associated with one transaction participant to be kept apart from funds associated with a different transaction participant > Collude with other transaction participants to infer a bitcoin user's behavior based on the flow of funds from one colluding entity, to the targeted user, to another colluding entity There is another, better criterion already listed under that countermeasure, OBPPV3-CR44: > Number of clicks to create a new identity container from the home screen of an existing identity container Problems with OBPPV3-CR45 were highlighted during our last edition testing phase. It’s not possible to accomplish this goal in zero clicks, it’s ostensibly a corner case, and just overall not a well-formed measurement of privacy. I suggest we simply delete it.
index: 1.0
text_combine: Revisit criterion for number of clicks to import a private key into an identity container - OBPPV3-CR45 is: > Number of clicks to assign an imported private key into an identity container It is only assigned to one countermeasure/attack: > Use multiple identities/accounts to allow funds associated with one transaction participant to be kept apart from funds associated with a different transaction participant > Collude with other transaction participants to infer a bitcoin user's behavior based on the flow of funds from one colluding entity, to the targeted user, to another colluding entity There is another, better criterion already listed under that countermeasure, OBPPV3-CR44: > Number of clicks to create a new identity container from the home screen of an existing identity container Problems with OBPPV3-CR45 were highlighted during our last edition testing phase. It’s not possible to accomplish this goal in zero clicks, it’s ostensibly a corner case, and just overall not a well-formed measurement of privacy. I suggest we simply delete it.
label: process
text: revisit criterion for number of clicks to import a private key into an identity container is number of clicks to assign an imported private key into an identity container it is only assigned to one countermeasure attack use multiple identities accounts to allow funds associated with one transaction participant to be kept apart from funds associated with a different transaction participant collude with other transaction participants to infer a bitcoin user s behavior based on the flow of funds from one colluding entity to the targeted user to another colluding entity there is another better criteria already listed under that countermeasure number of clicks to create a new identity container from the home screen of an existing identity container problems with were highlighted during our last edition testing phase it’s not possible to accomplish this goal in zero clicks it’s ostensibly a corner case and just overall not a well formed measurement of privacy i suggest we simply delete it
binary_label: 1
Record (Unnamed: 0 = 5,194)
id: 7,973,972,253
type: IssuesEvent
created_at: 2018-07-17 02:29:15
repo: pelias/pelias
repo_url: https://api.github.com/repos/pelias/pelias
action: closed
title: Check for housenumber === street in openaddresses imports
labels: processed
body: OpenAddresses has problematic data, see [here](https://github.com/openaddresses/openaddresses/issues/1841), and we currently import as is. This results in a less than awesome experience when searching for stuff like [this](http://pelias.github.io/compare/#/v1/search%3Ftext=4th%20street). Maybe we can at least check for this common case where the housenumber matches the street name and skip those addresses? cc @trescube
index: 1.0
text_combine: Check for housenumber === street in openaddresses imports - OpenAddresses has problematic data, see [here](https://github.com/openaddresses/openaddresses/issues/1841), and we currently import as is. This results in a less than awesome experience when searching for stuff like [this](http://pelias.github.io/compare/#/v1/search%3Ftext=4th%20street). Maybe we can at least check for this common case where the housenumber matches the street name and skip those addresses? cc @trescube
label: process
text: check for housenumber street in openaddresses imports openaddresses has problematic data see and we currently import as is this results in a less than awesome experience when searching for stuff like maybe we can at least check for this common case where the housenumber matches the street name and skip those addresses cc trescube
binary_label: 1
Record (Unnamed: 0 = 205,543)
id: 23,343,842,528
type: IssuesEvent
created_at: 2022-08-09 16:04:09
repo: MatBenfield/news
repo_url: https://api.github.com/repos/MatBenfield/news
action: closed
title: [SecurityWeek] Twilio Hacked After Employees Tricked Into Giving Up Login Credentials
labels: SecurityWeek Stale
body: **Enterprise software vendor Twilio (NYSE: TWLO) has been hacked by a relentless threat actor who successfully tricked employees into giving up login credentials that were then used to steal third-party customer data.** [read more](https://www.securityweek.com/twilio-hacked-after-employees-tricked-giving-login-credentials) <https://www.securityweek.com/twilio-hacked-after-employees-tricked-giving-login-credentials>
index: True
text_combine: [SecurityWeek] Twilio Hacked After Employees Tricked Into Giving Up Login Credentials - **Enterprise software vendor Twilio (NYSE: TWLO) has been hacked by a relentless threat actor who successfully tricked employees into giving up login credentials that were then used to steal third-party customer data.** [read more](https://www.securityweek.com/twilio-hacked-after-employees-tricked-giving-login-credentials) <https://www.securityweek.com/twilio-hacked-after-employees-tricked-giving-login-credentials>
label: non_process
text: twilio hacked after employees tricked into giving up login credentials enterprise software vendor twilio nyse twlo has been hacked by a relentless threat actor who successfully tricked employees into giving up login credentials that were then used to steal third party customer data
binary_label: 0
Record (Unnamed: 0 = 3,419)
id: 6,524,320,073
type: IssuesEvent
created_at: 2017-08-29 12:14:01
repo: kmycode/storycanvas-csharp
repo_url: https://api.github.com/repos/kmycode/storycanvas-csharp
action: closed
title: Character map feature
labels: enhancement priority-middle processing
body: * Place characters in a two-dimensional space called a "map" * Represent relationships between characters with arrows * Multiple maps can be created, and the same character can appear in multiple maps * Characters can be moved, edited, linked, and deleted via tap or drag-and-drop
index: 1.0
text_combine: Character map feature - * Place characters in a two-dimensional space called a "map" * Represent relationships between characters with arrows * Multiple maps can be created, and the same character can appear in multiple maps * Characters can be moved, edited, linked, and deleted via tap or drag-and-drop
label: process
text: character map feature place characters in a two dimensional space called a map represent relationships between characters with arrows multiple maps can be created and the same character can appear in multiple maps characters can be moved edited linked and deleted via tap or drag and drop
binary_label: 1
Record (Unnamed: 0 = 16,416)
id: 21,192,898,171
type: IssuesEvent
created_at: 2022-04-08 19:40:35
repo: googleapis/python-bigquery
repo_url: https://api.github.com/repos/googleapis/python-bigquery
action: closed
title: refactor AccessEntry to use `_properties` pattern
labels: api: bigquery type: process
body: **Is your feature request related to a problem? Please describe.** Often there is a private preview feature (or just a new feature that we haven't implemented yet) and we want to provide our customers a workaround to be able to send a resource as represented by JSON. Then we could have provided a workaround to the user in https://github.com/googleapis/python-bigquery/issues/1064 like the following: ``` entry = AccessEntry.from_api_repr({ "role": "READER", "dataset": { "projectId": "project-id", "datasetId": "dataset_id", } }) ``` or even ``` entry = AccessEntry("READER") entry._properties["dataset"] = { "projectId": "project-id", "datasetId": "dataset_id", } ``` and been confident that it would send the correct values when making the API request. Likewise, one could read values from the API like `entry._properties["dataset"]` when checking values from the API. **Describe the solution you'd like** The way we handle `entity_type` and `entity_id` is reminiscent of `ExternalConfig.options`, which I refactored in https://github.com/googleapis/python-bigquery/pull/994. It was clever at the time when there were only a few external data formats, but has diverged far from the actual API representation. `AccessEntry` should have separate properties for `view`, `routine`, `dataset`, etc. These could accept and return relevant types (e.g. TableReference) as well. Existing `entity_type` and `entity_id` should be made optional in the constructor, but in the same order so that backwards compatibility is maintained. **Describe alternatives you've considered** Leaving the class as-is should continue to work, but it's a significant risk IMO. **Additional context** As identified in https://github.com/googleapis/python-bigquery/pull/1075 See similar work in https://github.com/googleapis/python-bigquery/pull/994
index: 1.0
text_combine: refactor AccessEntry to use `_properties` pattern - **Is your feature request related to a problem? Please describe.** Often there is a private preview feature (or just a new feature that we haven't implemented yet) and we want to provide our customers a workaround to be able to send a resource as represented by JSON. Then we could have provided a workaround to the user in https://github.com/googleapis/python-bigquery/issues/1064 like the following: ``` entry = AccessEntry.from_api_repr({ "role": "READER", "dataset": { "projectId": "project-id", "datasetId": "dataset_id", } }) ``` or even ``` entry = AccessEntry("READER") entry._properties["dataset"] = { "projectId": "project-id", "datasetId": "dataset_id", } ``` and been confident that it would send the correct values when making the API request. Likewise, one could read values from the API like `entry._properties["dataset"]` when checking values from the API. **Describe the solution you'd like** The way we handle `entity_type` and `entity_id` is reminiscent of `ExternalConfig.options`, which I refactored in https://github.com/googleapis/python-bigquery/pull/994. It was clever at the time when there were only a few external data formats, but has diverged far from the actual API representation. `AccessEntry` should have separate properties for `view`, `routine`, `dataset`, etc. These could accept and return relevant types (e.g. TableReference) as well. Existing `entity_type` and `entity_id` should be made optional in the constructor, but in the same order so that backwards compatibility is maintained. **Describe alternatives you've considered** Leaving the class as-is should continue to work, but it's a significant risk IMO. **Additional context** As identified in https://github.com/googleapis/python-bigquery/pull/1075 See similar work in https://github.com/googleapis/python-bigquery/pull/994
label: process
text: refactor accessentry to use properties pattern is your feature request related to a problem please describe often there is a private preview feature or just a new feature that we haven t implemented yet and we want to provide our customers a workaround to be able to send a resource as represented by json then we could have provided a workaround to the user in like the following entry accessentry from api repr role reader dataset projectid project id datasetid dataset id or even entry accessentry reader entry properties projectid project id datasetid dataset id and been confident that it would send the correct values when making the api request likewise one could read values from the api like entry properties when checking values from the api describe the solution you d like the way we handle entity type and entity id is reminiscent of externalconfig options which i refactored in it was clever at the time when there were only a few external data formats but has diverged far the actual api representation accessentry should have separate properties for view routine dataset etc these could accept and return relevant types e g tablereference as well existing entity type and entity id should be made optional in the constructor but in the same order so that backwards compatibility is maintained describe alternatives you ve considered leaving the class as is should continue to work but it s a significant risk imo additional context as identified in see similar work in
binary_label: 1
Record (Unnamed: 0 = 18,163)
id: 24,199,654,386
type: IssuesEvent
created_at: 2022-09-24 11:27:45
repo: microsoft/vscode
repo_url: https://api.github.com/repos/microsoft/vscode
action: closed
title: conPty and winPty
labels: info-needed terminal-process
body: <!-- ⚠️⚠️ Do Not Delete This! feature_request_template ⚠️⚠️ --> <!-- Please read our Rules of Conduct: https://opensource.microsoft.com/codeofconduct/ --> <!-- Please search existing issues to avoid creating duplicates. --> <!-- Describe the feature you'd like. --> 1. when I use VSCode, the configuration of Windows Enable Conpty has confused me a lot 2. my windows is win10; whether I use conPty or winPty, the terminal has no problem 3. when I use win11, if I choose conPty, the terminal will disconnect; if I use winPty, the terminal is OK 4. can you tell me the difference of conPty and winPty? thank you a lot
index: 1.0
text_combine: conPty and winPty - <!-- ⚠️⚠️ Do Not Delete This! feature_request_template ⚠️⚠️ --> <!-- Please read our Rules of Conduct: https://opensource.microsoft.com/codeofconduct/ --> <!-- Please search existing issues to avoid creating duplicates. --> <!-- Describe the feature you'd like. --> 1. when I use VSCode, the configuration of Windows Enable Conpty has confused me a lot 2. my windows is win10; whether I use conPty or winPty, the terminal has no problem 3. when I use win11, if I choose conPty, the terminal will disconnect; if I use winPty, the terminal is OK 4. can you tell me the difference of conPty and winPty? thank you a lot
label: process
text: conpty and winpty when i use vscode the configuration of windows enable conpty has confused me a lot my windows is weather i use conpty or winpty the terminal has no problem when i use if i choose conpty the terminal will disconnect if i use winpty the terminal is ok can you tell me the difference of conpty and winpty thank you a lot
binary_label: 1
Record (Unnamed: 0 = 4,210)
id: 4,891,634,977
type: IssuesEvent
created_at: 2016-11-18 17:13:42
repo: kbenoit/quanteda
repo_url: https://api.github.com/repos/kbenoit/quanteda
action: closed
title: Ready corpus constructor for new API
labels: infrastructure
body: Involves: - [x] Tidying up documentation - [x] moving to separate files - [x] adding tests
index: 1.0
text_combine: Ready corpus constructor for new API - Involves: - [x] Tidying up documentation - [x] moving to separate files - [x] adding tests
label: non_process
text: ready corpus constructor for new api involves tidying up documentation moving to separate files adding tests
binary_label: 0
Record (Unnamed: 0 = 312,045)
id: 23,414,046,924
type: IssuesEvent
created_at: 2022-08-12 21:11:20
repo: nebari-dev/nebari-docs
repo_url: https://api.github.com/repos/nebari-dev/nebari-docs
action: opened
title: [DOC] Migrate day-to-day maintenance guide
labels: area: documentation 📖
body: ### Preliminary Checks - [X] This issue is not a question, feature request, RFC, or anything other than a bug report. Please post those things in GitHub Discussions: https://github.com/nebari-dev/nebari/discussions ### Summary [notion page](https://www.notion.so/quansightlabs/day-to-day-maintenance-9d468145edea428c93958e6b0c2f8ba9) Link to old docs: https://docs.qhub.dev/en/latest/source/admin_guide/system_maintenance.html The contents of this page need to be migrated into the new docs. I don't think it should be added as its own page, but added to other existing pages (like a configuration guide) ### Steps to Resolve this Issue .
index: 1.0
text_combine: [DOC] Migrate day-to-day maintenance guide - ### Preliminary Checks - [X] This issue is not a question, feature request, RFC, or anything other than a bug report. Please post those things in GitHub Discussions: https://github.com/nebari-dev/nebari/discussions ### Summary [notion page](https://www.notion.so/quansightlabs/day-to-day-maintenance-9d468145edea428c93958e6b0c2f8ba9) Link to old docs: https://docs.qhub.dev/en/latest/source/admin_guide/system_maintenance.html The contents of this page need to be migrated into the new docs. I don't think it should be added as its own page, but added to other existing pages (like a configuration guide) ### Steps to Resolve this Issue .
label: non_process
text: migrate day to day maintenance guide preliminary checks this issue is not a question feature request rfc or anything other than a bug report please post those things in github discussions summary link to old docs the contents of this page needs to be migrated into the new docs i don t think it should be added as its own page but added to other existed pages like a configuration guide steps to resolve this issue
binary_label: 0
Record (Unnamed: 0 = 14,912)
id: 18,296,459,190
type: IssuesEvent
created_at: 2021-10-05 20:57:35
repo: metabase/metabase
repo_url: https://api.github.com/repos/metabase/metabase
action: closed
title: Field alias different for metric and sort, causing invalid query on Snowflake
labels: Type:Bug Priority:P2 Querying/Processor Database/Snowflake
body:
**Describe the bug** When I want to sort my question by a metric (defined in the data model > metrics) I have this error: `SQL compilation error: error line 1 at position 1,535 invalid identifier '"Direct + Influenced Open Rate"'` My metrics has capital letters "Direct + Influenced Open Rate". The SQL request sent to my database Snowflake does not include the capital letters in the select statement but include the capital letters in the sort clause causing the invalid identifier issue. Here the request received by Snowflake from Metabase: `SELECT date_trunc("quarter", CAST("BATCH"."PUBLIC"."FACT_DAILY_CAMPAIGNS"."STAT_DATE" AS timestamp)) AS "STAT_DATE", "BATCH"."PUBLIC"."FACT_DAILY_CAMPAIGNS"."COMPANY_NAME" AS "COMPANY_NAME", "BATCH"."PUBLIC"."FACT_DAILY_CAMPAIGNS"."APP_NAME" AS "APP_NAME", "BATCH"."PUBLIC"."FACT_DAILY_CAMPAIGNS"."PLATFORM" AS "PLATFORM", "BATCH"."PUBLIC"."FACT_DAILY_CAMPAIGNS"."CAMPAIGN_TYPE" AS "CAMPAIGN_TYPE", (CAST(sum(("BATCH"."PUBLIC"."FACT_DAILY_CAMPAIGNS"."DIRECT_OPEN" + "BATCH"."PUBLIC"."FACT_DAILY_CAMPAIGNS"."INFLUENCED_OPEN")) AS float) / CASE WHEN sum("BATCH"."PUBLIC"."FACT_DAILY_CAMPAIGNS"."OPTIN_SENT") = 0 THEN NULL ELSE sum("BATCH"."PUBLIC"."FACT_DAILY_CAMPAIGNS"."OPTIN_SENT") END) AS "direct + influenced open rate", (CAST(sum("BATCH"."PUBLIC"."FACT_DAILY_CAMPAIGNS"."DIRECT_OPEN") AS float) / CASE WHEN sum("BATCH"."PUBLIC"."FACT_DAILY_CAMPAIGNS"."OPTIN_SENT") = 0 THEN NULL ELSE sum("BATCH"."PUBLIC"."FACT_DAILY_CAMPAIGNS"."OPTIN_SENT") END) AS "direct open rate" FROM "BATCH"."PUBLIC"."FACT_DAILY_CAMPAIGNS" WHERE ("BATCH"."PUBLIC"."FACT_DAILY_CAMPAIGNS"."COMPANY_NAME" = ? AND "BATCH"."PUBLIC"."FACT_DAILY_CAMPAIGNS"."STAT_DATE" >= ? AND "BATCH"."PUBLIC"."FACT_DAILY_CAMPAIGNS"."STAT_DATE" < ?) 
GROUP BY date_trunc("quarter", CAST("BATCH"."PUBLIC"."FACT_DAILY_CAMPAIGNS"."STAT_DATE" AS timestamp)), "BATCH"."PUBLIC"."FACT_DAILY_CAMPAIGNS"."COMPANY_NAME", "BATCH"."PUBLIC"."FACT_DAILY_CAMPAIGNS"."APP_NAME", "BATCH"."PUBLIC"."FACT_DAILY_CAMPAIGNS"."PLATFORM", "BATCH"."PUBLIC"."FACT_DAILY_CAMPAIGNS"."CAMPAIGN_TYPE" ORDER BY "Direct + Influenced Open Rate" DESC, date_trunc("quarter", CAST("BATCH"."PUBLIC"."FACT_DAILY_CAMPAIGNS"."STAT_DATE" AS timestamp)) ASC, "BATCH"."PUBLIC"."FACT_DAILY_CAMPAIGNS"."COMPANY_NAME" ASC, "BATCH"."PUBLIC"."FACT_DAILY_CAMPAIGNS"."APP_NAME" ASC, "BATCH"."PUBLIC"."FACT_DAILY_CAMPAIGNS"."PLATFORM" ASC, "BATCH"."PUBLIC"."FACT_DAILY_CAMPAIGNS"."CAMPAIGN_TYPE" ASC` **Logs** `[e8ce7813-0f33-40db-8c58-40f02f2ec031] 2021-10-04T10:42:11+02:00 DEBUG metabase.server.middleware.log GET /api/database/5/autocomplete_suggestions 200 8.4 ms (6 DB calls) App DB connections: 0/15 Jetty threads: 5/50 (7 idle, 0 queued) (121 total active threads) Queries in flight: 0 (0 queued) [e8ce7813-0f33-40db-8c58-40f02f2ec031] 2021-10-04T10:42:14+02:00 ERROR metabase.query-processor.middleware.catch-exceptions Error processing query: null {:database_id 5, :started_at #t "2021-10-04T08:42:13.397499Z[GMT]", :via [{:status :failed, :class clojure.lang.ExceptionInfo, :error "Error executing query", :stacktrace ["--> driver.sql_jdbc.execute$execute_reducible_query$fn__80803.invoke(execute.clj:480)" "driver.sql_jdbc.execute$execute_reducible_query.invokeStatic(execute.clj:477)" "driver.sql_jdbc.execute$execute_reducible_query.invoke(execute.clj:463)" "driver.sql_jdbc.execute$execute_reducible_query.invokeStatic(execute.clj:472)" "driver.sql_jdbc.execute$execute_reducible_query.invoke(execute.clj:463)" "driver.sql_jdbc$fn__82292.invokeStatic(sql_jdbc.clj:54)" "driver.sql_jdbc$fn__82292.invoke(sql_jdbc.clj:52)" "query_processor.context$executef.invokeStatic(context.clj:59)" "query_processor.context$executef.invoke(context.clj:48)" 
"query_processor.context.default$default_runf.invokeStatic(default.clj:68)" "query_processor.context.default$default_runf.invoke(default.clj:66)" "query_processor.context$runf.invokeStatic(context.clj:45)" "query_processor.context$runf.invoke(context.clj:39)" "query_processor.reducible$pivot.invokeStatic(reducible.clj:34)" "query_processor.reducible$pivot.invoke(reducible.clj:31)" "query_processor.middleware.mbql_to_native$mbql__GT_native$fn__47903.invoke(mbql_to_native.clj:25)" "query_processor.middleware.check_features$check_features$fn__47017.invoke(check_features.clj:39)" "query_processor.middleware.limit$limit$fn__47889.invoke(limit.clj:37)" "query_processor.middleware.cache$maybe_return_cached_results$fn__46469.invoke(cache.clj:204)" "query_processor.middleware.optimize_temporal_filters$optimize_temporal_filters$fn__48149.invoke(optimize_temporal_filters.clj:204)" "query_processor.middleware.validate_temporal_bucketing$validate_temporal_bucketing$fn__50081.invoke(validate_temporal_bucketing.clj:50)" "query_processor.middleware.auto_parse_filter_values$auto_parse_filter_values$fn__45588.invoke(auto_parse_filter_values.clj:43)" "query_processor.middleware.wrap_value_literals$wrap_value_literals$fn__41716.invoke(wrap_value_literals.clj:161)" "query_processor.middleware.annotate$add_column_info$fn__41591.invoke(annotate.clj:608)" "query_processor.middleware.permissions$check_query_permissions$fn__46889.invoke(permissions.clj:81)" "query_processor.middleware.pre_alias_aggregations$pre_alias_aggregations$fn__49010.invoke(pre_alias_aggregations.clj:40)" "query_processor.middleware.cumulative_aggregations$handle_cumulative_aggregations$fn__47090.invoke(cumulative_aggregations.clj:60)" "query_processor.middleware.resolve_joined_fields$resolve_joined_fields$fn__49307.invoke(resolve_joined_fields.clj:102)" "query_processor.middleware.resolve_joins$resolve_joins$fn__49620.invoke(resolve_joins.clj:171)" 
"query_processor.middleware.add_implicit_joins$add_implicit_joins$fn__45164.invoke(add_implicit_joins.clj:190)" "query_processor.middleware.large_int_id$convert_id_to_string$fn__47853.invoke(large_int_id.clj:59)" "query_processor.middleware.format_rows$format_rows$fn__47834.invoke(format_rows.clj:74)" "query_processor.middleware.add_default_temporal_unit$add_default_temporal_unit$fn__44458.invoke(add_default_temporal_unit.clj:23)" "query_processor.middleware.desugar$desugar$fn__47156.invoke(desugar.clj:21)" "query_processor.middleware.binning$update_binning_strategy$fn__45975.invoke(binning.clj:229)" "query_processor.middleware.resolve_fields$resolve_fields$fn__46692.invoke(resolve_fields.clj:34)" "query_processor.middleware.add_dimension_projections$add_remapping$fn__44813.invoke(add_dimension_projections.clj:312)" "query_processor.middleware.add_implicit_clauses$add_implicit_clauses$fn__45042.invoke(add_implicit_clauses.clj:147)" "query_processor.middleware.upgrade_field_literals$upgrade_field_literals$fn__50030.invoke(upgrade_field_literals.clj:40)" "query_processor.middleware.add_source_metadata$add_source_metadata_for_source_queries$fn__45327.invoke(add_source_metadata.clj:123)" "query_processor.middleware.reconcile_breakout_and_order_by_bucketing$reconcile_breakout_and_order_by_bucketing$fn__49182.invoke(reconcile_breakout_and_order_by_bucketing.clj:100)" "query_processor.middleware.auto_bucket_datetimes$auto_bucket_datetimes$fn__45535.invoke(auto_bucket_datetimes.clj:147)" "query_processor.middleware.resolve_source_table$resolve_source_tables$fn__46739.invoke(resolve_source_table.clj:45)" "query_processor.middleware.parameters$substitute_parameters$fn__48992.invoke(parameters.clj:111)" "query_processor.middleware.resolve_referenced$resolve_referenced_card_resources$fn__46791.invoke(resolve_referenced.clj:79)" "query_processor.middleware.expand_macros$expand_macros$fn__47540.invoke(expand_macros.clj:184)" 
"query_processor.middleware.add_timezone_info$add_timezone_info$fn__45336.invoke(add_timezone_info.clj:15)" "query_processor.middleware.splice_params_in_response$splice_params_in_response$fn__49983.invoke(splice_params_in_response.clj:32)" "query_processor.middleware.resolve_database_and_driver$resolve_database_and_driver$fn__49193$fn__49197.invoke(resolve_database_and_driver.clj:31)" "driver$do_with_driver.invokeStatic(driver.clj:60)" "driver$do_with_driver.invoke(driver.clj:56)" "query_processor.middleware.resolve_database_and_driver$resolve_database_and_driver$fn__49193.invoke(resolve_database_and_driver.clj:25)" "query_processor.middleware.fetch_source_query$resolve_card_id_source_tables$fn__47780.invoke(fetch_source_query.clj:274)" "query_processor.middleware.store$initialize_store$fn__49992$fn__49993.invoke(store.clj:11)" "query_processor.store$do_with_store.invokeStatic(store.clj:44)" "query_processor.store$do_with_store.invoke(store.clj:38)" "query_processor.middleware.store$initialize_store$fn__49992.invoke(store.clj:10)" "query_processor.middleware.validate$validate_query$fn__50037.invoke(validate.clj:10)" "query_processor.middleware.normalize_query$normalize$fn__47916.invoke(normalize_query.clj:22)" "query_processor.middleware.add_rows_truncated$add_rows_truncated$fn__45182.invoke(add_rows_truncated.clj:35)" "query_processor.middleware.results_metadata$record_and_return_metadata_BANG_$fn__49968.invoke(results_metadata.clj:147)" "query_processor.middleware.constraints$add_default_userland_constraints$fn__47033.invoke(constraints.clj:42)" "query_processor.middleware.process_userland_query$process_userland_query$fn__49079.invoke(process_userland_query.clj:134)" "query_processor.middleware.catch_exceptions$catch_exceptions$fn__46973.invoke(catch_exceptions.clj:173)" "query_processor.reducible$async_qp$qp_STAR___38255$thunk__38256.invoke(reducible.clj:103)" "query_processor.reducible$async_qp$qp_STAR___38255.invoke(reducible.clj:109)" 
"query_processor.reducible$sync_qp$qp_STAR___38264$fn__38267.invoke(reducible.clj:135)" "query_processor.reducible$sync_qp$qp_STAR___38264.invoke(reducible.clj:134)" "query_processor$process_userland_query.invokeStatic(query_processor.clj:241)" "query_processor$process_userland_query.doInvoke(query_processor.clj:237)" "query_processor$fn__50127$process_query_and_save_execution_BANG___50136$fn__50139.invoke(query_processor.clj:253)" "query_processor$fn__50127$process_query_and_save_execution_BANG___50136.invoke(query_processor.clj:245)" "query_processor$fn__50171$process_query_and_save_with_max_results_constraints_BANG___50180$fn__50183.invoke(query_processor.clj:265)" "query_processor$fn__50171$process_query_and_save_with_max_results_constraints_BANG___50180.invoke(query_processor.clj:258)" "api.dataset$run_query_async$fn__56418.invoke(dataset.clj:56)" "query_processor.streaming$streaming_response_STAR_$fn__56397$fn__56398.invoke(streaming.clj:72)" "query_processor.streaming$streaming_response_STAR_$fn__56397.invoke(streaming.clj:71)" "async.streaming_response$do_f_STAR_.invokeStatic(streaming_response.clj:65)" "async.streaming_response$do_f_STAR_.invoke(streaming_response.clj:63)" "async.streaming_response$do_f_async$fn__16088.invoke(streaming_response.clj:84)"], :error_type :invalid-query, :ex-data {:sql "-- Metabase:: userID: 6 queryType: native queryHash: a485998d869b9b6f42b1ba2d36aca7534609f12c39a68ffb10a1734f16d3e6e4\nwith daily as(\n select\n date_trunc('quarter', stat_date) as quarter\n ,app_name\n ,company_name\n ,platform\n ,campaign_type\n ,coalesce(sum(direct_open), 0) as direct_open\n ,coalesce(sum(influenced_open), 0) as influenced_open\n ,coalesce(sum(optin_sent), 0) as optin_sent\n from fact_daily_campaigns\n where true\n \n and \"BATCH\".\"PUBLIC\".\"FACT_DAILY_CAMPAIGNS\".\"COMPANY_NAME\" IN (?)\n \n \n group by 1, 2, 3, 4, 5\n)\n\nselect\nquarter\n,app_name\n,company_name\n,platform\n,campaign_type\n,direct_open / optin_sent as 
direct_open_rate\n,(direct_open + influenced_open) / optin_sent as direct_influenced_open_rate\nfrom daily\nwhere optin_sent > 0\nqualify row_number() over(partition by quarter, app_name, company_name, platform, campaign_type order by ?) <= ?)", :params ["Back Market" "direct_open_rate" "10"], :type :invalid-query}}], :state "42000", :error_type :invalid-query, :json_query {:database 5, :native {:template-tags {:app_name {:id "580454c4-6cae-2081-3172-9d14f49debd7", :name "app_name", :display-name "App Name", :type "dimension", :dimension ["field" 1529 nil], :widget-type "category"}, :company_name {:id "abf0a1b7-dc62-878c-6761-78cac99b8d23", :name "company_name", :display-name "Company Name", :type "dimension", :dimension ["field" 1522 nil], :widget-type "category"}, :platform {:id "600d05e0-66f1-15ad-61f3-4a822c643e4a", :name "platform", :display-name "Platform", :type "dimension", :dimension ["field" 1525 nil], :widget-type "category"}, :campaign_type {:id "cee2a022-fb06-6118-7de1-609f8327fc04", :name "campaign_type", :display-name "Campaign type", :type "dimension", :dimension ["field" 1531 nil], :widget-type "category"}, :rank_type {:id "46dd005e-815a-8232-b885-8295751a9950", :name "rank_type", :display-name "Rank Type", :type "text", :required true, :default "direct_open_rate"}, :rank {:id "df771ef3-0a31-11b5-8c81-6755a7658fe3", :name "rank", :display-name "Rank", :type "text", :required true, :default "10"}}, :query "with daily as(\n select\n date_trunc('quarter', stat_date) as quarter\n ,app_name\n ,company_name\n ,platform\n ,campaign_type\n ,coalesce(sum(direct_open), 0) as direct_open\n ,coalesce(sum(influenced_open), 0) as influenced_open\n ,coalesce(sum(optin_sent), 0) as optin_sent\n from fact_daily_campaigns\n where true\n [[ and {{app_name}}]]\n [[ and {{company_name}}]]\n [[ and {{platform}}]]\n [[ and {{campaign_type}}]]\n group by 1, 2, 3, 4, 5\n)\n\nselect\nquarter\n,app_name\n,company_name\n,platform\n,campaign_type\n,direct_open / optin_sent as 
direct_open_rate\n,(direct_open + influenced_open) / optin_sent as direct_influenced_open_rate\nfrom daily\nwhere optin_sent > 0\nqualify row_number() over(partition by quarter, app_name, company_name, platform, campaign_type order by {{rank_type}}) <= {{rank}})"}, :type "native", :parameters [{:type "category", :value ["Back Market"], :target ["dimension" ["template-tag" "company_name"]]}], :middleware {:js-int-to-string? true, :add-default-userland-constraints? true}}, :status :failed, :class net.snowflake.client.jdbc.SnowflakeSQLException, :stacktrace ["net.snowflake.client.jdbc.SnowflakeUtil.checkErrorAndThrowExceptionSub(SnowflakeUtil.java:124)" "net.snowflake.client.jdbc.SnowflakeUtil.checkErrorAndThrowException(SnowflakeUtil.java:64)" "net.snowflake.client.core.StmtUtil.pollForOutput(StmtUtil.java:434)" "net.snowflake.client.core.StmtUtil.execute(StmtUtil.java:338)" "net.snowflake.client.core.SFStatement.executeHelper(SFStatement.java:501)" "net.snowflake.client.core.SFStatement.executeQueryInternal(SFStatement.java:229)" "net.snowflake.client.core.SFStatement.executeQuery(SFStatement.java:167)" "net.snowflake.client.core.SFStatement.execute(SFStatement.java:749)" "net.snowflake.client.jdbc.SnowflakeStatementV1.executeQueryInternal(SnowflakeStatementV1.java:245)" "net.snowflake.client.jdbc.SnowflakePreparedStatementV1.executeQuery(SnowflakePreparedStatementV1.java:117)" "com.mchange.v2.c3p0.impl.NewProxyPreparedStatement.executeQuery(NewProxyPreparedStatement.java:431)" "--> driver.sql_jdbc.execute$fn__80721.invokeStatic(execute.clj:340)" "driver.sql_jdbc.execute$fn__80721.invoke(execute.clj:338)" "driver.sql_jdbc.execute$execute_statement_or_prepared_statement_BANG_.invokeStatic(execute.clj:353)" "driver.sql_jdbc.execute$execute_statement_or_prepared_statement_BANG_.invoke(execute.clj:349)" "driver.sql_jdbc.execute$execute_reducible_query$fn__80803.invoke(execute.clj:478)" "driver.sql_jdbc.execute$execute_reducible_query.invokeStatic(execute.clj:477)" 
"driver.sql_jdbc.execute$execute_reducible_query.invoke(execute.clj:463)" "driver.sql_jdbc.execute$execute_reducible_query.invokeStatic(execute.clj:472)" "driver.sql_jdbc.execute$execute_reducible_query.invoke(execute.clj:463)" "driver.sql_jdbc$fn__82292.invokeStatic(sql_jdbc.clj:54)" "driver.sql_jdbc$fn__82292.invoke(sql_jdbc.clj:52)" "query_processor.context$executef.invokeStatic(context.clj:59)" "query_processor.context$executef.invoke(context.clj:48)" "query_processor.context.default$default_runf.invokeStatic(default.clj:68)" "query_processor.context.default$default_runf.invoke(default.clj:66)" "query_processor.context$runf.invokeStatic(context.clj:45)" "query_processor.context$runf.invoke(context.clj:39)" "query_processor.reducible$pivot.invokeStatic(reducible.clj:34)" "query_processor.reducible$pivot.invoke(reducible.clj:31)" "query_processor.middleware.mbql_to_native$mbql__GT_native$fn__47903.invoke(mbql_to_native.clj:25)" "query_processor.middleware.check_features$check_features$fn__47017.invoke(check_features.clj:39)" "query_processor.middleware.limit$limit$fn__47889.invoke(limit.clj:37)" "query_processor.middleware.cache$maybe_return_cached_results$fn__46469.invoke(cache.clj:204)" "query_processor.middleware.optimize_temporal_filters$optimize_temporal_filters$fn__48149.invoke(optimize_temporal_filters.clj:204)" "query_processor.middleware.validate_temporal_bucketing$validate_temporal_bucketing$fn__50081.invoke(validate_temporal_bucketing.clj:50)" "query_processor.middleware.auto_parse_filter_values$auto_parse_filter_values$fn__45588.invoke(auto_parse_filter_values.clj:43)" "query_processor.middleware.wrap_value_literals$wrap_value_literals$fn__41716.invoke(wrap_value_literals.clj:161)" "query_processor.middleware.annotate$add_column_info$fn__41591.invoke(annotate.clj:608)" "query_processor.middleware.permissions$check_query_permissions$fn__46889.invoke(permissions.clj:81)" 
"query_processor.middleware.pre_alias_aggregations$pre_alias_aggregations$fn__49010.invoke(pre_alias_aggregations.clj:40)" "query_processor.middleware.cumulative_aggregations$handle_cumulative_aggregations$fn__47090.invoke(cumulative_aggregations.clj:60)" "query_processor.middleware.resolve_joined_fields$resolve_joined_fields$fn__49307.invoke(resolve_joined_fields.clj:102)" "query_processor.middleware.resolve_joins$resolve_joins$fn__49620.invoke(resolve_joins.clj:171)" "query_processor.middleware.add_implicit_joins$add_implicit_joins$fn__45164.invoke(add_implicit_joins.clj:190)" "query_processor.middleware.large_int_id$convert_id_to_string$fn__47853.invoke(large_int_id.clj:59)" "query_processor.middleware.format_rows$format_rows$fn__47834.invoke(format_rows.clj:74)" "query_processor.middleware.add_default_temporal_unit$add_default_temporal_unit$fn__44458.invoke(add_default_temporal_unit.clj:23)" "query_processor.middleware.desugar$desugar$fn__47156.invoke(desugar.clj:21)" "query_processor.middleware.binning$update_binning_strategy$fn__45975.invoke(binning.clj:229)" "query_processor.middleware.resolve_fields$resolve_fields$fn__46692.invoke(resolve_fields.clj:34)" "query_processor.middleware.add_dimension_projections$add_remapping$fn__44813.invoke(add_dimension_projections.clj:312)" "query_processor.middleware.add_implicit_clauses$add_implicit_clauses$fn__45042.invoke(add_implicit_clauses.clj:147)" "query_processor.middleware.upgrade_field_literals$upgrade_field_literals$fn__50030.invoke(upgrade_field_literals.clj:40)" "query_processor.middleware.add_source_metadata$add_source_metadata_for_source_queries$fn__45327.invoke(add_source_metadata.clj:123)" "query_processor.middleware.reconcile_breakout_and_order_by_bucketing$reconcile_breakout_and_order_by_bucketing$fn__49182.invoke(reconcile_breakout_and_order_by_bucketing.clj:100)" "query_processor.middleware.auto_bucket_datetimes$auto_bucket_datetimes$fn__45535.invoke(auto_bucket_datetimes.clj:147)" 
"query_processor.middleware.resolve_source_table$resolve_source_tables$fn__46739.invoke(resolve_source_table.clj:45)" "query_processor.middleware.parameters$substitute_parameters$fn__48992.invoke(parameters.clj:111)" "query_processor.middleware.resolve_referenced$resolve_referenced_card_resources$fn__46791.invoke(resolve_referenced.clj:79)" "query_processor.middleware.expand_macros$expand_macros$fn__47540.invoke(expand_macros.clj:184)" "query_processor.middleware.add_timezone_info$add_timezone_info$fn__45336.invoke(add_timezone_info.clj:15)" "query_processor.middleware.splice_params_in_response$splice_params_in_response$fn__49983.invoke(splice_params_in_response.clj:32)" "query_processor.middleware.resolve_database_and_driver$resolve_database_and_driver$fn__49193$fn__49197.invoke(resolve_database_and_driver.clj:31)" "driver$do_with_driver.invokeStatic(driver.clj:60)" "driver$do_with_driver.invoke(driver.clj:56)" "query_processor.middleware.resolve_database_and_driver$resolve_database_and_driver$fn__49193.invoke(resolve_database_and_driver.clj:25)" "query_processor.middleware.fetch_source_query$resolve_card_id_source_tables$fn__47780.invoke(fetch_source_query.clj:274)" "query_processor.middleware.store$initialize_store$fn__49992$fn__49993.invoke(store.clj:11)" "query_processor.store$do_with_store.invokeStatic(store.clj:44)" "query_processor.store$do_with_store.invoke(store.clj:38)" "query_processor.middleware.store$initialize_store$fn__49992.invoke(store.clj:10)" "query_processor.middleware.validate$validate_query$fn__50037.invoke(validate.clj:10)" "query_processor.middleware.normalize_query$normalize$fn__47916.invoke(normalize_query.clj:22)" "query_processor.middleware.add_rows_truncated$add_rows_truncated$fn__45182.invoke(add_rows_truncated.clj:35)" "query_processor.middleware.results_metadata$record_and_return_metadata_BANG_$fn__49968.invoke(results_metadata.clj:147)" 
"query_processor.middleware.constraints$add_default_userland_constraints$fn__47033.invoke(constraints.clj:42)" "query_processor.middleware.process_userland_query$process_userland_query$fn__49079.invoke(process_userland_query.clj:134)" "query_processor.middleware.catch_exceptions$catch_exceptions$fn__46973.invoke(catch_exceptions.clj:173)" "query_processor.reducible$async_qp$qp_STAR___38255$thunk__38256.invoke(reducible.clj:103)" "query_processor.reducible$async_qp$qp_STAR___38255.invoke(reducible.clj:109)" "query_processor.reducible$sync_qp$qp_STAR___38264$fn__38267.invoke(reducible.clj:135)" "query_processor.reducible$sync_qp$qp_STAR___38264.invoke(reducible.clj:134)" "query_processor$process_userland_query.invokeStatic(query_processor.clj:241)" "query_processor$process_userland_query.doInvoke(query_processor.clj:237)" "query_processor$fn__50127$process_query_and_save_execution_BANG___50136$fn__50139.invoke(query_processor.clj:253)" "query_processor$fn__50127$process_query_and_save_execution_BANG___50136.invoke(query_processor.clj:245)" "query_processor$fn__50171$process_query_and_save_with_max_results_constraints_BANG___50180$fn__50183.invoke(query_processor.clj:265)" "query_processor$fn__50171$process_query_and_save_with_max_results_constraints_BANG___50180.invoke(query_processor.clj:258)" "api.dataset$run_query_async$fn__56418.invoke(dataset.clj:56)" "query_processor.streaming$streaming_response_STAR_$fn__56397$fn__56398.invoke(streaming.clj:72)" "query_processor.streaming$streaming_response_STAR_$fn__56397.invoke(streaming.clj:71)" "async.streaming_response$do_f_STAR_.invokeStatic(streaming_response.clj:65)" "async.streaming_response$do_f_STAR_.invoke(streaming_response.clj:63)" "async.streaming_response$do_f_async$fn__16088.invoke(streaming_response.clj:84)"], :context :ad-hoc, :error "SQL compilation error:\nsyntax error line 30 at position 112 unexpected ')'.", :row_count 0, :running_time 0, :data {:rows [], :cols []}}` **To Reproduce** Steps to reproduce 
the behavior:

1. Create a metric with capital letters in the name
2. Use this metric in a question
3. Sort on this metric

**Metabase Diagnostic Info**

```json
{
  "browser-info": {
    "language": "fr-FR",
    "platform": "MacIntel",
    "userAgent": "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/94.0.4606.61 Safari/537.36",
    "vendor": "Google Inc."
  },
  "system-info": {
    "file.encoding": "UTF-8",
    "java.runtime.name": "OpenJDK Runtime Environment",
    "java.runtime.version": "11.0.12+7",
    "java.vendor": "Eclipse Foundation",
    "java.vendor.url": "https://adoptium.net/",
    "java.version": "11.0.12",
    "java.vm.name": "OpenJDK 64-Bit Server VM",
    "java.vm.version": "11.0.12+7",
    "os.name": "Linux",
    "os.version": "4.19.0-16-cloud-amd64",
    "user.language": "en",
    "user.timezone": "GMT"
  },
  "metabase-info": {
    "databases": ["snowflake"],
    "hosting-env": "unknown",
    "application-database": "h2",
    "application-database-details": {
      "database": { "name": "H2", "version": "1.4.197 (2018-03-18)" },
      "jdbc-driver": { "name": "H2 JDBC Driver", "version": "1.4.197 (2018-03-18)" }
    },
    "run-mode": "prod",
    "version": {
      "date": "2021-09-09",
      "tag": "v0.40.4",
      "branch": "release-x.40.x",
      "hash": "16d2e53"
    },
    "settings": { "report-timezone": "Europe/Paris" }
  }
}
```
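Since Snowflake treats double-quoted identifiers as case-sensitive, the generated SELECT alias `"direct + influenced open rate"` and the ORDER BY reference `"Direct + Influenced Open Rate"` name two different identifiers, which is what produces the invalid-identifier error. A minimal, hypothetical lint check (not part of Metabase; the function name and regexes are illustrative only) can flag such mismatches before a query is sent:

```python
import re

def undefined_order_by_aliases(sql: str) -> set:
    """Return quoted names referenced in ORDER BY that were never
    defined as an alias (with the exact same case) in the SELECT list.
    Snowflake compares quoted identifiers case-sensitively."""
    # aliases defined as: ... AS "some alias"
    defined = set(re.findall(r'AS\s+"([^"]+)"', sql))
    order_by = re.search(r'ORDER BY\s+(.+)', sql, re.IGNORECASE | re.DOTALL)
    if not order_by:
        return set()
    # quoted names referenced in the ORDER BY clause
    referenced = set(re.findall(r'"([^"]+)"', order_by.group(1)))
    # skip qualified column parts ("DB"."SCHEMA"."TABLE"), flag only bare
    # aliases that were never defined with this exact case
    return {name for name in referenced
            if name not in defined and "." not in name}

sql = ('SELECT sum(x) AS "direct + influenced open rate" FROM t '
       'GROUP BY y ORDER BY "Direct + Influenced Open Rate" DESC')
print(undefined_order_by_aliases(sql))  # {'Direct + Influenced Open Rate'}
```

Quoting the alias with the same case in both clauses (or folding both to lowercase, as the SELECT list already does) would make the query valid.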
process
field alias different for metric and sort causing invalid query on snowflake describe the bug when i want to sort my question by a metric defined in the data model metrics i have this error sql compilation error error line at position invalid identifier direct influenced open rate my metrics has capital letters direct influenced open rate the sql request sent to my database snowflake does not include the capital letters in the select statement but include the capital letters in the sort clause causing the invalid identifier issue here the request received by snowflake from metabase select date trunc quarter cast batch public fact daily campaigns stat date as timestamp as stat date batch public fact daily campaigns company name as company name batch public fact daily campaigns app name as app name batch public fact daily campaigns platform as platform batch public fact daily campaigns campaign type as campaign type cast sum batch public fact daily campaigns direct open batch public fact daily campaigns influenced open as float case when sum batch public fact daily campaigns optin sent then null else sum batch public fact daily campaigns optin sent end as direct influenced open rate cast sum batch public fact daily campaigns direct open as float case when sum batch public fact daily campaigns optin sent then null else sum batch public fact daily campaigns optin sent end as direct open rate from batch public fact daily campaigns where batch public fact daily campaigns company name and batch public fact daily campaigns stat date and batch public fact daily campaigns stat date group by date trunc quarter cast batch public fact daily campaigns stat date as timestamp batch public fact daily campaigns company name batch public fact daily campaigns app name batch public fact daily campaigns platform batch public fact daily campaigns campaign type order by direct influenced open rate desc date trunc quarter cast batch public fact daily campaigns stat date as timestamp asc 
batch public fact daily campaigns company name asc batch public fact daily campaigns app name asc batch public fact daily campaigns platform asc batch public fact daily campaigns campaign type asc logs debug metabase server middleware log get api database autocomplete suggestions ms db calls app db connections jetty threads idle queued total active threads queries in flight queued error metabase query processor middleware catch exceptions error processing query null database id started at t via status failed class clojure lang exceptioninfo error error executing query stacktrace driver sql jdbc execute execute reducible query fn invoke execute clj driver sql jdbc execute execute reducible query invokestatic execute clj driver sql jdbc execute execute reducible query invoke execute clj driver sql jdbc execute execute reducible query invokestatic execute clj driver sql jdbc execute execute reducible query invoke execute clj driver sql jdbc fn invokestatic sql jdbc clj driver sql jdbc fn invoke sql jdbc clj query processor context executef invokestatic context clj query processor context executef invoke context clj query processor context default default runf invokestatic default clj query processor context default default runf invoke default clj query processor context runf invokestatic context clj query processor context runf invoke context clj query processor reducible pivot invokestatic reducible clj query processor reducible pivot invoke reducible clj query processor middleware mbql to native mbql gt native fn invoke mbql to native clj query processor middleware check features check features fn invoke check features clj query processor middleware limit limit fn invoke limit clj query processor middleware cache maybe return cached results fn invoke cache clj query processor middleware optimize temporal filters optimize temporal filters fn invoke optimize temporal filters clj query processor middleware validate temporal bucketing validate temporal bucketing fn 
invoke validate temporal bucketing clj query processor middleware auto parse filter values auto parse filter values fn invoke auto parse filter values clj query processor middleware wrap value literals wrap value literals fn invoke wrap value literals clj query processor middleware annotate add column info fn invoke annotate clj query processor middleware permissions check query permissions fn invoke permissions clj query processor middleware pre alias aggregations pre alias aggregations fn invoke pre alias aggregations clj query processor middleware cumulative aggregations handle cumulative aggregations fn invoke cumulative aggregations clj query processor middleware resolve joined fields resolve joined fields fn invoke resolve joined fields clj query processor middleware resolve joins resolve joins fn invoke resolve joins clj query processor middleware add implicit joins add implicit joins fn invoke add implicit joins clj query processor middleware large int id convert id to string fn invoke large int id clj query processor middleware format rows format rows fn invoke format rows clj query processor middleware add default temporal unit add default temporal unit fn invoke add default temporal unit clj query processor middleware desugar desugar fn invoke desugar clj query processor middleware binning update binning strategy fn invoke binning clj query processor middleware resolve fields resolve fields fn invoke resolve fields clj query processor middleware add dimension projections add remapping fn invoke add dimension projections clj query processor middleware add implicit clauses add implicit clauses fn invoke add implicit clauses clj query processor middleware upgrade field literals upgrade field literals fn invoke upgrade field literals clj query processor middleware add source metadata add source metadata for source queries fn invoke add source metadata clj query processor middleware reconcile breakout and order by bucketing reconcile breakout and order by 
bucketing fn invoke reconcile breakout and order by bucketing clj query processor middleware auto bucket datetimes auto bucket datetimes fn invoke auto bucket datetimes clj query processor middleware resolve source table resolve source tables fn invoke resolve source table clj query processor middleware parameters substitute parameters fn invoke parameters clj query processor middleware resolve referenced resolve referenced card resources fn invoke resolve referenced clj query processor middleware expand macros expand macros fn invoke expand macros clj query processor middleware add timezone info add timezone info fn invoke add timezone info clj query processor middleware splice params in response splice params in response fn invoke splice params in response clj query processor middleware resolve database and driver resolve database and driver fn fn invoke resolve database and driver clj driver do with driver invokestatic driver clj driver do with driver invoke driver clj query processor middleware resolve database and driver resolve database and driver fn invoke resolve database and driver clj query processor middleware fetch source query resolve card id source tables fn invoke fetch source query clj query processor middleware store initialize store fn fn invoke store clj query processor store do with store invokestatic store clj query processor store do with store invoke store clj query processor middleware store initialize store fn invoke store clj query processor middleware validate validate query fn invoke validate clj query processor middleware normalize query normalize fn invoke normalize query clj query processor middleware add rows truncated add rows truncated fn invoke add rows truncated clj query processor middleware results metadata record and return metadata bang fn invoke results metadata clj query processor middleware constraints add default userland constraints fn invoke constraints clj query processor middleware process userland query process 
userland query fn invoke process userland query clj query processor middleware catch exceptions catch exceptions fn invoke catch exceptions clj query processor reducible async qp qp star thunk invoke reducible clj query processor reducible async qp qp star invoke reducible clj query processor reducible sync qp qp star fn invoke reducible clj query processor reducible sync qp qp star invoke reducible clj query processor process userland query invokestatic query processor clj query processor process userland query doinvoke query processor clj query processor fn process query and save execution bang fn invoke query processor clj query processor fn process query and save execution bang invoke query processor clj query processor fn process query and save with max results constraints bang fn invoke query processor clj query processor fn process query and save with max results constraints bang invoke query processor clj api dataset run query async fn invoke dataset clj query processor streaming streaming response star fn fn invoke streaming clj query processor streaming streaming response star fn invoke streaming clj async streaming response do f star invokestatic streaming response clj async streaming response do f star invoke streaming response clj async streaming response do f async fn invoke streaming response clj error type invalid query ex data sql metabase userid querytype native queryhash nwith daily as n select n date trunc quarter stat date as quarter n app name n company name n platform n campaign type n coalesce sum direct open as direct open n coalesce sum influenced open as influenced open n coalesce sum optin sent as optin sent n from fact daily campaigns n where true n n and batch public fact daily campaigns company name in n n n group by n n nselect nquarter n app name n company name n platform n campaign type n direct open optin sent as direct open rate n direct open influenced open optin sent as direct influenced open rate nfrom daily nwhere optin sent 
nqualify row number over partition by quarter app name company name platform campaign type order by params type invalid query state error type invalid query json query database native template tags app name id name app name display name app name type dimension dimension widget type category company name id name company name display name company name type dimension dimension widget type category platform id name platform display name platform type dimension dimension widget type category campaign type id name campaign type display name campaign type type dimension dimension widget type category rank type id name rank type display name rank type type text required true default direct open rate rank id name rank display name rank type text required true default query with daily as n select n date trunc quarter stat date as quarter n app name n company name n platform n campaign type n coalesce sum direct open as direct open n coalesce sum influenced open as influenced open n coalesce sum optin sent as optin sent n from fact daily campaigns n where true n n n n n group by n n nselect nquarter n app name n company name n platform n campaign type n direct open optin sent as direct open rate n direct open influenced open optin sent as direct influenced open rate nfrom daily nwhere optin sent nqualify row number over partition by quarter app name company name platform campaign type order by rank type rank type native parameters target middleware js int to string true add default userland constraints true status failed class net snowflake client jdbc snowflakesqlexception stacktrace net snowflake client jdbc snowflakeutil checkerrorandthrowexceptionsub snowflakeutil java net snowflake client jdbc snowflakeutil checkerrorandthrowexception snowflakeutil java net snowflake client core stmtutil pollforoutput stmtutil java net snowflake client core stmtutil execute stmtutil java net snowflake client core sfstatement executehelper sfstatement java net snowflake client core 
sfstatement executequeryinternal sfstatement java net snowflake client core sfstatement executequery sfstatement java net snowflake client core sfstatement execute sfstatement java net snowflake client jdbc executequeryinternal java net snowflake client jdbc executequery java com mchange impl newproxypreparedstatement executequery newproxypreparedstatement java driver sql jdbc execute fn invokestatic execute clj driver sql jdbc execute fn invoke execute clj driver sql jdbc execute execute statement or prepared statement bang invokestatic execute clj driver sql jdbc execute execute statement or prepared statement bang invoke execute clj driver sql jdbc execute execute reducible query fn invoke execute clj driver sql jdbc execute execute reducible query invokestatic execute clj driver sql jdbc execute execute reducible query invoke execute clj driver sql jdbc execute execute reducible query invokestatic execute clj driver sql jdbc execute execute reducible query invoke execute clj driver sql jdbc fn invokestatic sql jdbc clj driver sql jdbc fn invoke sql jdbc clj query processor context executef invokestatic context clj query processor context executef invoke context clj query processor context default default runf invokestatic default clj query processor context default default runf invoke default clj query processor context runf invokestatic context clj query processor context runf invoke context clj query processor reducible pivot invokestatic reducible clj query processor reducible pivot invoke reducible clj query processor middleware mbql to native mbql gt native fn invoke mbql to native clj query processor middleware check features check features fn invoke check features clj query processor middleware limit limit fn invoke limit clj query processor middleware cache maybe return cached results fn invoke cache clj query processor middleware optimize temporal filters optimize temporal filters fn invoke optimize temporal filters clj query processor middleware 
validate temporal bucketing validate temporal bucketing fn invoke validate temporal bucketing clj query processor middleware auto parse filter values auto parse filter values fn invoke auto parse filter values clj query processor middleware wrap value literals wrap value literals fn invoke wrap value literals clj query processor middleware annotate add column info fn invoke annotate clj query processor middleware permissions check query permissions fn invoke permissions clj query processor middleware pre alias aggregations pre alias aggregations fn invoke pre alias aggregations clj query processor middleware cumulative aggregations handle cumulative aggregations fn invoke cumulative aggregations clj query processor middleware resolve joined fields resolve joined fields fn invoke resolve joined fields clj query processor middleware resolve joins resolve joins fn invoke resolve joins clj query processor middleware add implicit joins add implicit joins fn invoke add implicit joins clj query processor middleware large int id convert id to string fn invoke large int id clj query processor middleware format rows format rows fn invoke format rows clj query processor middleware add default temporal unit add default temporal unit fn invoke add default temporal unit clj query processor middleware desugar desugar fn invoke desugar clj query processor middleware binning update binning strategy fn invoke binning clj query processor middleware resolve fields resolve fields fn invoke resolve fields clj query processor middleware add dimension projections add remapping fn invoke add dimension projections clj query processor middleware add implicit clauses add implicit clauses fn invoke add implicit clauses clj query processor middleware upgrade field literals upgrade field literals fn invoke upgrade field literals clj query processor middleware add source metadata add source metadata for source queries fn invoke add source metadata clj query processor middleware reconcile breakout 
and order by bucketing reconcile breakout and order by bucketing fn invoke reconcile breakout and order by bucketing clj query processor middleware auto bucket datetimes auto bucket datetimes fn invoke auto bucket datetimes clj query processor middleware resolve source table resolve source tables fn invoke resolve source table clj query processor middleware parameters substitute parameters fn invoke parameters clj query processor middleware resolve referenced resolve referenced card resources fn invoke resolve referenced clj query processor middleware expand macros expand macros fn invoke expand macros clj query processor middleware add timezone info add timezone info fn invoke add timezone info clj query processor middleware splice params in response splice params in response fn invoke splice params in response clj query processor middleware resolve database and driver resolve database and driver fn fn invoke resolve database and driver clj driver do with driver invokestatic driver clj driver do with driver invoke driver clj query processor middleware resolve database and driver resolve database and driver fn invoke resolve database and driver clj query processor middleware fetch source query resolve card id source tables fn invoke fetch source query clj query processor middleware store initialize store fn fn invoke store clj query processor store do with store invokestatic store clj query processor store do with store invoke store clj query processor middleware store initialize store fn invoke store clj query processor middleware validate validate query fn invoke validate clj query processor middleware normalize query normalize fn invoke normalize query clj query processor middleware add rows truncated add rows truncated fn invoke add rows truncated clj query processor middleware results metadata record and return metadata bang fn invoke results metadata clj query processor middleware constraints add default userland constraints fn invoke constraints clj query 
processor middleware process userland query process userland query fn invoke process userland query clj query processor middleware catch exceptions catch exceptions fn invoke catch exceptions clj query processor reducible async qp qp star thunk invoke reducible clj query processor reducible async qp qp star invoke reducible clj query processor reducible sync qp qp star fn invoke reducible clj query processor reducible sync qp qp star invoke reducible clj query processor process userland query invokestatic query processor clj query processor process userland query doinvoke query processor clj query processor fn process query and save execution bang fn invoke query processor clj query processor fn process query and save execution bang invoke query processor clj query processor fn process query and save with max results constraints bang fn invoke query processor clj query processor fn process query and save with max results constraints bang invoke query processor clj api dataset run query async fn invoke dataset clj query processor streaming streaming response star fn fn invoke streaming clj query processor streaming streaming response star fn invoke streaming clj async streaming response do f star invokestatic streaming response clj async streaming response do f star invoke streaming response clj async streaming response do f async fn invoke streaming response clj context ad hoc error sql compilation error nsyntax error line at position unexpected row count running time data rows cols to reproduce steps to reproduce the behavior create a metrics with capital letters in the name use this metric in a question sort on this metric metabase diagnostic info json browser info language fr fr platform macintel useragent mozilla macintosh intel mac os x applewebkit khtml like gecko chrome safari vendor google inc system info file encoding utf java runtime name openjdk runtime environment java runtime version java vendor eclipse foundation java vendor url java version java vm 
name openjdk bit server vm java vm version os name linux os version cloud user language en user timezone gmt metabase info databases snowflake hosting env unknown application database application database details database name version jdbc driver name jdbc driver version run mode prod version date tag branch release x x hash settings report timezone europe paris
1
68,367
13,123,818,761
IssuesEvent
2020-08-06 01:48:08
DS-13-Dev-Team/DS13
https://api.github.com/repos/DS-13-Dev-Team/DS13
closed
Runtime error when joining as crew
Bug Cannot Reproduce Code
runtime error: undefined proc or verb /mob/living/silicon/ai/get assignment(). proc name: hear radio (/mob/proc/hear_radio) source file: hear_say.dm,150 usr: the new player (/mob/new_player) src: (/mob/living/carbon/human) usr.loc: null src.loc: the deck (163,121,1) (/turf/simulated/floor/tiled/white) call stack: (/mob/living/carbon/human): hear radio("Nehnahblahblah Nehblahnah, Sur...", "states", null, "<span style=\'color: #009190\'...", " <span class=\'message\...", "", Arrivals Announcement Computer (/mob/living/silicon/ai), 0, "Arrivals Announcement Computer") Broadcast Message(/datum/radio_frequency (/datum/radio_frequency), Arrivals Announcement Computer (/mob/living/silicon/ai), 0, "says", the shortwave radio (/obj/item/device/radio/announcer), "Nehnahblahblah Nehblahnah, Sur...", "Arrivals Announcement Computer", "AI", "Arrivals Announcement Computer", "synthesized voice", null, 0, /list (/list), 1355, "states", null, "Medical", "#009190") Subspace Broadcaster (/obj/machinery/telecomms/broadcaster/preset_right): receive information(/datum/signal (/datum/signal), Telecommunication Hub (/obj/machinery/telecomms/hub/preset)) Telecommunication Hub (/obj/machinery/telecomms/hub/preset): relay information(/datum/signal (/datum/signal), /obj/machinery/telecomms/broad... 
(/obj/machinery/telecomms/broadcaster), 1, 20) Telecommunication Hub (/obj/machinery/telecomms/hub/preset): receive information(/datum/signal (/datum/signal), Telecommunication Server (/obj/machinery/telecomms/server/presets/medical)) Telecommunication Server (/obj/machinery/telecomms/server/presets/medical): relay information(/datum/signal (/datum/signal), /obj/machinery/telecomms/hub (/obj/machinery/telecomms/hub), null, 20) Telecommunication Server (/obj/machinery/telecomms/server/presets/medical): receive information(/datum/signal (/datum/signal), Bus Mainframe (/obj/machinery/telecomms/bus/preset_one)) Bus Mainframe (/obj/machinery/telecomms/bus/preset_one): relay information(/datum/signal (/datum/signal), /obj/machinery/telecomms/serve... (/obj/machinery/telecomms/server), null, 20) Bus Mainframe (/obj/machinery/telecomms/bus/preset_one): receive information(/datum/signal (/datum/signal), Processor Unit (/obj/machinery/telecomms/processor/preset_one)) Processor Unit (/obj/machinery/telecomms/processor/preset_one): relay direct information(/datum/signal (/datum/signal), Bus Mainframe (/obj/machinery/telecomms/bus/preset_one)) ... 
Telecommunication Hub (/obj/machinery/telecomms/hub/preset): relay information(/datum/signal (/datum/signal), /obj/machinery/telecomms/bus (/obj/machinery/telecomms/bus), 1, 20) Telecommunication Hub (/obj/machinery/telecomms/hub/preset): receive information(/datum/signal (/datum/signal), Subspace Receiver (/obj/machinery/telecomms/receiver/preset_right)) Subspace Receiver (/obj/machinery/telecomms/receiver/preset_right): relay information(/datum/signal (/datum/signal), /obj/machinery/telecomms/hub (/obj/machinery/telecomms/hub), null, 20) Subspace Receiver (/obj/machinery/telecomms/receiver/preset_right): receive signal(/datum/signal (/datum/signal)) the shortwave radio (/obj/item/device/radio/announcer): talk into(Arrivals Announcement Computer (/mob/living/silicon/ai), "Nehnahblahblah Nehblahnah, Sur...", "Medical", "states", null) the shortwave radio (/obj/item/device/radio/announcer): autosay("Nehnahblahblah Nehblahnah, Sur...", "Arrivals Announcement Computer", "Medical") AnnounceArrivalSimple("Nehnahblahblah Nehblahnah", "Surgeon", "has completed cryogenic reviva...", "Medical") the new player (/mob/new_player): AttemptLateSpawn(/datum/job/surg (/datum/job/surg), "Default") NanakoAC (/client): Topic("src=\[0x3000011];SelectedJob=S...", /list (/list), the new player (/mob/new_player)) NanakoAC (/client): Topic("src=\[0x3000011];SelectedJob=S...", /list (/list), the new player (/mob/new_player))
1.0
Runtime error when joining as crew - runtime error: undefined proc or verb /mob/living/silicon/ai/get assignment(). proc name: hear radio (/mob/proc/hear_radio) source file: hear_say.dm,150 usr: the new player (/mob/new_player) src: (/mob/living/carbon/human) usr.loc: null src.loc: the deck (163,121,1) (/turf/simulated/floor/tiled/white) call stack: (/mob/living/carbon/human): hear radio("Nehnahblahblah Nehblahnah, Sur...", "states", null, "<span style=\'color: #009190\'...", " <span class=\'message\...", "", Arrivals Announcement Computer (/mob/living/silicon/ai), 0, "Arrivals Announcement Computer") Broadcast Message(/datum/radio_frequency (/datum/radio_frequency), Arrivals Announcement Computer (/mob/living/silicon/ai), 0, "says", the shortwave radio (/obj/item/device/radio/announcer), "Nehnahblahblah Nehblahnah, Sur...", "Arrivals Announcement Computer", "AI", "Arrivals Announcement Computer", "synthesized voice", null, 0, /list (/list), 1355, "states", null, "Medical", "#009190") Subspace Broadcaster (/obj/machinery/telecomms/broadcaster/preset_right): receive information(/datum/signal (/datum/signal), Telecommunication Hub (/obj/machinery/telecomms/hub/preset)) Telecommunication Hub (/obj/machinery/telecomms/hub/preset): relay information(/datum/signal (/datum/signal), /obj/machinery/telecomms/broad... 
(/obj/machinery/telecomms/broadcaster), 1, 20) Telecommunication Hub (/obj/machinery/telecomms/hub/preset): receive information(/datum/signal (/datum/signal), Telecommunication Server (/obj/machinery/telecomms/server/presets/medical)) Telecommunication Server (/obj/machinery/telecomms/server/presets/medical): relay information(/datum/signal (/datum/signal), /obj/machinery/telecomms/hub (/obj/machinery/telecomms/hub), null, 20) Telecommunication Server (/obj/machinery/telecomms/server/presets/medical): receive information(/datum/signal (/datum/signal), Bus Mainframe (/obj/machinery/telecomms/bus/preset_one)) Bus Mainframe (/obj/machinery/telecomms/bus/preset_one): relay information(/datum/signal (/datum/signal), /obj/machinery/telecomms/serve... (/obj/machinery/telecomms/server), null, 20) Bus Mainframe (/obj/machinery/telecomms/bus/preset_one): receive information(/datum/signal (/datum/signal), Processor Unit (/obj/machinery/telecomms/processor/preset_one)) Processor Unit (/obj/machinery/telecomms/processor/preset_one): relay direct information(/datum/signal (/datum/signal), Bus Mainframe (/obj/machinery/telecomms/bus/preset_one)) ... 
Telecommunication Hub (/obj/machinery/telecomms/hub/preset): relay information(/datum/signal (/datum/signal), /obj/machinery/telecomms/bus (/obj/machinery/telecomms/bus), 1, 20) Telecommunication Hub (/obj/machinery/telecomms/hub/preset): receive information(/datum/signal (/datum/signal), Subspace Receiver (/obj/machinery/telecomms/receiver/preset_right)) Subspace Receiver (/obj/machinery/telecomms/receiver/preset_right): relay information(/datum/signal (/datum/signal), /obj/machinery/telecomms/hub (/obj/machinery/telecomms/hub), null, 20) Subspace Receiver (/obj/machinery/telecomms/receiver/preset_right): receive signal(/datum/signal (/datum/signal)) the shortwave radio (/obj/item/device/radio/announcer): talk into(Arrivals Announcement Computer (/mob/living/silicon/ai), "Nehnahblahblah Nehblahnah, Sur...", "Medical", "states", null) the shortwave radio (/obj/item/device/radio/announcer): autosay("Nehnahblahblah Nehblahnah, Sur...", "Arrivals Announcement Computer", "Medical") AnnounceArrivalSimple("Nehnahblahblah Nehblahnah", "Surgeon", "has completed cryogenic reviva...", "Medical") the new player (/mob/new_player): AttemptLateSpawn(/datum/job/surg (/datum/job/surg), "Default") NanakoAC (/client): Topic("src=\[0x3000011];SelectedJob=S...", /list (/list), the new player (/mob/new_player)) NanakoAC (/client): Topic("src=\[0x3000011];SelectedJob=S...", /list (/list), the new player (/mob/new_player))
non_process
runtime error when joining as crew runtime error undefined proc or verb mob living silicon ai get assignment proc name hear radio mob proc hear radio source file hear say dm usr the new player mob new player src mob living carbon human usr loc null src loc the deck turf simulated floor tiled white call stack mob living carbon human hear radio nehnahblahblah nehblahnah sur states null span style color span class message arrivals announcement computer mob living silicon ai arrivals announcement computer broadcast message datum radio frequency datum radio frequency arrivals announcement computer mob living silicon ai says the shortwave radio obj item device radio announcer nehnahblahblah nehblahnah sur arrivals announcement computer ai arrivals announcement computer synthesized voice null list list states null medical subspace broadcaster obj machinery telecomms broadcaster preset right receive information datum signal datum signal telecommunication hub obj machinery telecomms hub preset telecommunication hub obj machinery telecomms hub preset relay information datum signal datum signal obj machinery telecomms broad obj machinery telecomms broadcaster telecommunication hub obj machinery telecomms hub preset receive information datum signal datum signal telecommunication server obj machinery telecomms server presets medical telecommunication server obj machinery telecomms server presets medical relay information datum signal datum signal obj machinery telecomms hub obj machinery telecomms hub null telecommunication server obj machinery telecomms server presets medical receive information datum signal datum signal bus mainframe obj machinery telecomms bus preset one bus mainframe obj machinery telecomms bus preset one relay information datum signal datum signal obj machinery telecomms serve obj machinery telecomms server null bus mainframe obj machinery telecomms bus preset one receive information datum signal datum signal processor unit obj machinery telecomms 
processor preset one processor unit obj machinery telecomms processor preset one relay direct information datum signal datum signal bus mainframe obj machinery telecomms bus preset one telecommunication hub obj machinery telecomms hub preset relay information datum signal datum signal obj machinery telecomms bus obj machinery telecomms bus telecommunication hub obj machinery telecomms hub preset receive information datum signal datum signal subspace receiver obj machinery telecomms receiver preset right subspace receiver obj machinery telecomms receiver preset right relay information datum signal datum signal obj machinery telecomms hub obj machinery telecomms hub null subspace receiver obj machinery telecomms receiver preset right receive signal datum signal datum signal the shortwave radio obj item device radio announcer talk into arrivals announcement computer mob living silicon ai nehnahblahblah nehblahnah sur medical states null the shortwave radio obj item device radio announcer autosay nehnahblahblah nehblahnah sur arrivals announcement computer medical announcearrivalsimple nehnahblahblah nehblahnah surgeon has completed cryogenic reviva medical the new player mob new player attemptlatespawn datum job surg datum job surg default nanakoac client topic src selectedjob s list list the new player mob new player nanakoac client topic src selectedjob s list list the new player mob new player
0
397,005
27,145,142,185
IssuesEvent
2023-02-16 19:20:48
aws/aws-sdk-js-v3
https://api.github.com/repos/aws/aws-sdk-js-v3
closed
The ChallengeResponses of AdminRespondToAuthChallenge doesn't have options for CUSTOM_CHALLENGE
response-requested documentation
### Describe the issue When viewing the shape of **ChallengeResponses** in [AdminRespondToAuthChallengeCommandInput](https://docs.aws.amazon.com/AWSJavaScriptSDK/v3/latest/clients/client-cognito-identity-provider/interfaces/adminrespondtoauthchallengecommandinput.html), it says that its value depend on the **ChallengeName** and proceeds to list down all the values for each challenge... However the challenge **CUSTOM_CHALLENGE** isn't written in there. What should be the **ChallengeResponses** for the **CUSTOM_CHALLENGE**? ### Links - https://docs.aws.amazon.com/AWSJavaScriptSDK/v3/latest/clients/client-cognito-identity-provider/interfaces/adminrespondtoauthchallengecommandinput.html#challengeresponses
1.0
The ChallengeResponses of AdminRespondToAuthChallenge doesn't have options for CUSTOM_CHALLENGE - ### Describe the issue When viewing the shape of **ChallengeResponses** in [AdminRespondToAuthChallengeCommandInput](https://docs.aws.amazon.com/AWSJavaScriptSDK/v3/latest/clients/client-cognito-identity-provider/interfaces/adminrespondtoauthchallengecommandinput.html), it says that its value depend on the **ChallengeName** and proceeds to list down all the values for each challenge... However the challenge **CUSTOM_CHALLENGE** isn't written in there. What should be the **ChallengeResponses** for the **CUSTOM_CHALLENGE**? ### Links - https://docs.aws.amazon.com/AWSJavaScriptSDK/v3/latest/clients/client-cognito-identity-provider/interfaces/adminrespondtoauthchallengecommandinput.html#challengeresponses
non_process
the challengeresponses of adminrespondtoauthchallenge doesn t have options for custom challenge describe the issue when viewing the shape of challengeresponses in it says that its value depend on the challengename and proceeds to list down all the values for each challenge however the challenge custom challenge isn t written in there what should be the challengeresponses for the custom challenge links
0
2,894
5,873,089,500
IssuesEvent
2017-05-15 13:16:29
SpongePowered/Mixin
https://api.github.com/repos/SpongePowered/Mixin
closed
`remap = false` on `@Mixin` annotation doesn't seem to affect remap in `@At`
annotation processor bug minor
I get [bunch of warnings](https://paste.nightsnack.cf/oyakaxuvip.rb) about missing mappings for `@At`, even though I have `remap=false` for `@Mixin` annotation. Not sure if that is intended. (Some) [Code is here](https://paste.nightsnack.cf/iwiwucofod.java), if anyone wants to take a closer look.
1.0
`remap = false` on `@Mixin` annotation doesn't seem to affect remap in `@At` - I get [bunch of warnings](https://paste.nightsnack.cf/oyakaxuvip.rb) about missing mappings for `@At`, even though I have `remap=false` for `@Mixin` annotation. Not sure if that is intended. (Some) [Code is here](https://paste.nightsnack.cf/iwiwucofod.java), if anyone wants to take a closer look.
process
remap false on mixin annotation doesn t seem to affect remap in at i get about missing mappings for at even though i have remap false for mixin annotation not sure if that is intended some if anyone wants to take a closer look
1
117,863
9,962,139,110
IssuesEvent
2019-07-07 12:03:02
132nd-vWing/TRM
https://api.github.com/repos/132nd-vWing/TRM
closed
ARCO returning through MOA West
completed testing requested
When ARCO are replaced and goes RTB, he takes shortest route back to base, which takes him through MOA WEST. Would it be possible to change so he follows his original flightpath(while flying into the area) back to base?
1.0
ARCO returning through MOA West - When ARCO are replaced and goes RTB, he takes shortest route back to base, which takes him through MOA WEST. Would it be possible to change so he follows his original flightpath(while flying into the area) back to base?
non_process
arco returning through moa west when arco are replaced and goes rtb he takes shortest route back to base which takes him through moa west would it be possible to change so he follows his original flightpath while flying into the area back to base
0
35,905
6,507,039,555
IssuesEvent
2017-08-24 11:36:22
chauncy-crib/tagprobot
https://api.github.com/repos/chauncy-crib/tagprobot
closed
Start each game with a chat message that prints our available commands
documentation new feature
We currently only have Q and V, but I already forget them all the time.
1.0
Start each game with a chat message that prints our available commands - We currently only have Q and V, but I already forget them all the time.
non_process
start each game with a chat message that prints our available commands we currently only have q and v but i already forget them all the time
0
1,580
4,174,682,341
IssuesEvent
2016-06-21 14:44:32
opentrials/opentrials
https://api.github.com/repos/opentrials/opentrials
opened
Can't find trial by searching for id M05-775
bug Data cleaning Processors
We can find it on ClinicalTrials.gov (https://clinicaltrials.gov/ct2/results?term=M05-775), but can't on OpenTrials (http://explorer.opentrials.net/search?q=M05-775). This trial exists on OpenTrials (http://explorer.opentrials.net/trials/d4487ef3-09bb-40c4-afed-46afbab310a6), we just don't have the `M05-775` study id.
1.0
Can't find trial by searching for id M05-775 - We can find it on ClinicalTrials.gov (https://clinicaltrials.gov/ct2/results?term=M05-775), but can't on OpenTrials (http://explorer.opentrials.net/search?q=M05-775). This trial exists on OpenTrials (http://explorer.opentrials.net/trials/d4487ef3-09bb-40c4-afed-46afbab310a6), we just don't have the `M05-775` study id.
process
can t find trial by searching for id we can find it on clinicaltrials gov but can t on opentrials this trial exists on opentrials we just don t have the study id
1
418,202
12,194,635,920
IssuesEvent
2020-04-29 16:05:44
11ty/eleventy
https://api.github.com/repos/11ty/eleventy
closed
Computed data not computed in the necessary order
bug high-priority
**Describe the bug** Hello, I'm trying to use Eleventy for a multilingual site, using file extensions to set the locale (rather than a separate folder). I need to set 3 properties in this order: 1. the `locale` read from the file path 2. a `key` to match translated content 3. the `permalink` derived from the `key` and the `locale` The computation of the data happens in the wrong order and the `permalink` gets computed before the `key`, leading to `undefined` being set in the path. **To Reproduce** I've created a repo to reproduce : https://github.com/rhumaric/eleventy-computed-property-order-issue If you run `npx @11ty/eleventy`, you'll notice the 'generating permalink' happens before the 'Setting key test'. There's also some weird logs happening ahead of it, similar to those on #971 **Expected behavior** I'd have expected Eleventy to detect that `permalink` depended on the `key`, as announced in the documentation. **Environment:** - OS and Version: Ubuntu 18.04 - Eleventy Version 0.11.0-beta.1
1.0
Computed data not computed in the necessary order - **Describe the bug** Hello, I'm trying to use Eleventy for a multilingual site, using file extensions to set the locale (rather than a separate folder). I need to set 3 properties in this order: 1. the `locale` read from the file path 2. a `key` to match translated content 3. the `permalink` derived from the `key` and the `locale` The computation of the data happens in the wrong order and the `permalink` gets computed before the `key`, leading to `undefined` being set in the path. **To Reproduce** I've created a repo to reproduce : https://github.com/rhumaric/eleventy-computed-property-order-issue If you run `npx @11ty/eleventy`, you'll notice the 'generating permalink' happens before the 'Setting key test'. There's also some weird logs happening ahead of it, similar to those on #971 **Expected behavior** I'd have expected Eleventy to detect that `permalink` depended on the `key`, as announced in the documentation. **Environment:** - OS and Version: Ubuntu 18.04 - Eleventy Version 0.11.0-beta.1
non_process
computed data not computed in the necessary order describe the bug hello i m trying to use eleventy for a multilingual site using file extensions to set the locale rather than a separate folder i need to set properties in this order the locale read from the file path a key to match translated content the permalink derived from the key and the locale the computation of the data happens in the wrong order and the permalink gets computed before the key leading to undefined being set in the path to reproduce i ve created a repo to reproduce if you run npx eleventy you ll notice the generating permalink happens before the setting key test there s also some weird logs happening ahead of it similar to those on expected behavior i d have expected eleventy to detect that permalink depended on the key as announced in the documentation environment os and version ubuntu eleventy version beta
0
406,850
11,903,246,296
IssuesEvent
2020-03-30 15:04:26
googleapis/google-cloud-dotnet
https://api.github.com/repos/googleapis/google-cloud-dotnet
closed
Synthesis failed for Google.Cloud.Spanner.V1
autosynth failure priority: p1 type: bug
Hello! Autosynth couldn't regenerate Google.Cloud.Spanner.V1. :broken_heart: Here's the output from running `synth.py`: ``` Cloning into 'working_repo'... Switched to a new branch 'autosynth-Google.Cloud.Spanner.V1' Cloning into '/tmpfs/tmp/tmpky2y34ui/googleapis'... Note: checking out '88316b63a486002727e14032d104690541179fc9'. You are in 'detached HEAD' state. You can look around, make experimental changes and commit them, and you can discard any commits you make in this state without impacting any branches by performing another checkout. If you want to create a new branch to retain commits you create, you may do so (now or later) by using -b with the checkout command again. Example: git checkout -b <new-branch-name> HEAD is now at 88316b63a Generated synth.py files with multiple commits enabled Note: checking out '7be2811ad17013a5ea24cd75dfd9e399dd6e18fe'. You are in 'detached HEAD' state. You can look around, make experimental changes and commit them, and you can discard any commits you make in this state without impacting any branches by performing another checkout. If you want to create a new branch to retain commits you create, you may do so (now or later) by using -b with the checkout command again. Example: git checkout -b <new-branch-name> HEAD is now at 7be2811a fix: Update gapic-generator version to pickup discogapic fixes Switched to a new branch 'autosynth-Google.Cloud.Spanner.V1-2' Running synthtool ['/tmpfs/src/git/autosynth/env/bin/python3', '-m', 'synthtool', '--metadata', 'synth.metadata', 'synth.py', '--'] 2020-03-30 07:28:26,594 synthtool > Executing /tmpfs/src/git/autosynth/working_repo/apis/Google.Cloud.Spanner.V1/synth.py. 
Contents of preconfig file: {"preclonedRepos": {"https://github.com/googleapis/google-cloud-dotnet.git": "/tmpfs/src/git/autosynth/working_repo", "https://github.com/googleapis/googleapis.git": "/tmpfs/tmp/tmpky2y34ui/googleapis"}}Extracted repo location: /tmpfs/tmp/tmpky2y34ui/googleapis generateapis.sh: line 40: declare: GOOGLEAPIS: readonly variable 2020-03-30 07:28:26,641 synthtool > Failed executing /bin/bash generateapis.sh --check_compatibility Google.Cloud.Spanner.V1: None 2020-03-30 07:28:26,642 synthtool > Wrote metadata to synth.metadata. Traceback (most recent call last): File "/home/kbuilder/.pyenv/versions/3.6.9/lib/python3.6/runpy.py", line 193, in _run_module_as_main "__main__", mod_spec) File "/home/kbuilder/.pyenv/versions/3.6.9/lib/python3.6/runpy.py", line 85, in _run_code exec(code, run_globals) File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/__main__.py", line 102, in <module> main() File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/click/core.py", line 829, in __call__ return self.main(*args, **kwargs) File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/click/core.py", line 782, in main rv = self.invoke(ctx) File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/click/core.py", line 1066, in invoke return ctx.invoke(self.callback, **ctx.params) File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/click/core.py", line 610, in invoke return callback(*args, **kwargs) File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/__main__.py", line 94, in main spec.loader.exec_module(synth_module) # type: ignore File "<frozen importlib._bootstrap_external>", line 678, in exec_module File "<frozen importlib._bootstrap>", line 219, in _call_with_frames_removed File "/tmpfs/src/git/autosynth/working_repo/apis/Google.Cloud.Spanner.V1/synth.py", line 22, in <module> hide_output = False) File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/shell.py", line 
39, in run raise exc File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/shell.py", line 33, in run encoding="utf-8", File "/home/kbuilder/.pyenv/versions/3.6.9/lib/python3.6/subprocess.py", line 438, in run output=stdout, stderr=stderr) subprocess.CalledProcessError: Command '('/bin/bash', 'generateapis.sh', '--check_compatibility', 'Google.Cloud.Spanner.V1')' returned non-zero exit status 1. Synthesis failed Traceback (most recent call last): File "/home/kbuilder/.pyenv/versions/3.6.9/lib/python3.6/runpy.py", line 193, in _run_module_as_main "__main__", mod_spec) File "/home/kbuilder/.pyenv/versions/3.6.9/lib/python3.6/runpy.py", line 85, in _run_code exec(code, run_globals) File "/tmpfs/src/git/autosynth/autosynth/synth.py", line 482, in <module> main() File "/tmpfs/src/git/autosynth/autosynth/synth.py", line 334, in main return _inner_main(temp_dir) File "/tmpfs/src/git/autosynth/autosynth/synth.py", line 429, in _inner_main branch, File "/tmpfs/src/git/autosynth/autosynth/synth.py", line 135, in synthesize_loop metadata_path, extra_args, deprecated_execution, environ, synthesize_py_path, File "/tmpfs/src/git/autosynth/autosynth/synth.py", line 278, in synthesize synth_proc.check_returncode() # Raise an exception. File "/home/kbuilder/.pyenv/versions/3.6.9/lib/python3.6/subprocess.py", line 389, in check_returncode self.stderr) subprocess.CalledProcessError: Command '['/tmpfs/src/git/autosynth/env/bin/python3', '-m', 'synthtool', '--metadata', 'synth.metadata', 'synth.py', '--']' returned non-zero exit status 1. ``` Google internal developers can see the full log [here](https://sponge/e62b0d94-aaf2-4a14-b8f8-8803561a9577).
1.0
Synthesis failed for Google.Cloud.Spanner.V1 - Hello! Autosynth couldn't regenerate Google.Cloud.Spanner.V1. :broken_heart: Here's the output from running `synth.py`: ``` Cloning into 'working_repo'... Switched to a new branch 'autosynth-Google.Cloud.Spanner.V1' Cloning into '/tmpfs/tmp/tmpky2y34ui/googleapis'... Note: checking out '88316b63a486002727e14032d104690541179fc9'. You are in 'detached HEAD' state. You can look around, make experimental changes and commit them, and you can discard any commits you make in this state without impacting any branches by performing another checkout. If you want to create a new branch to retain commits you create, you may do so (now or later) by using -b with the checkout command again. Example: git checkout -b <new-branch-name> HEAD is now at 88316b63a Generated synth.py files with multiple commits enabled Note: checking out '7be2811ad17013a5ea24cd75dfd9e399dd6e18fe'. You are in 'detached HEAD' state. You can look around, make experimental changes and commit them, and you can discard any commits you make in this state without impacting any branches by performing another checkout. If you want to create a new branch to retain commits you create, you may do so (now or later) by using -b with the checkout command again. Example: git checkout -b <new-branch-name> HEAD is now at 7be2811a fix: Update gapic-generator version to pickup discogapic fixes Switched to a new branch 'autosynth-Google.Cloud.Spanner.V1-2' Running synthtool ['/tmpfs/src/git/autosynth/env/bin/python3', '-m', 'synthtool', '--metadata', 'synth.metadata', 'synth.py', '--'] 2020-03-30 07:28:26,594 synthtool > Executing /tmpfs/src/git/autosynth/working_repo/apis/Google.Cloud.Spanner.V1/synth.py. 
Contents of preconfig file: {"preclonedRepos": {"https://github.com/googleapis/google-cloud-dotnet.git": "/tmpfs/src/git/autosynth/working_repo", "https://github.com/googleapis/googleapis.git": "/tmpfs/tmp/tmpky2y34ui/googleapis"}}Extracted repo location: /tmpfs/tmp/tmpky2y34ui/googleapis generateapis.sh: line 40: declare: GOOGLEAPIS: readonly variable 2020-03-30 07:28:26,641 synthtool > Failed executing /bin/bash generateapis.sh --check_compatibility Google.Cloud.Spanner.V1: None 2020-03-30 07:28:26,642 synthtool > Wrote metadata to synth.metadata. Traceback (most recent call last): File "/home/kbuilder/.pyenv/versions/3.6.9/lib/python3.6/runpy.py", line 193, in _run_module_as_main "__main__", mod_spec) File "/home/kbuilder/.pyenv/versions/3.6.9/lib/python3.6/runpy.py", line 85, in _run_code exec(code, run_globals) File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/__main__.py", line 102, in <module> main() File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/click/core.py", line 829, in __call__ return self.main(*args, **kwargs) File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/click/core.py", line 782, in main rv = self.invoke(ctx) File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/click/core.py", line 1066, in invoke return ctx.invoke(self.callback, **ctx.params) File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/click/core.py", line 610, in invoke return callback(*args, **kwargs) File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/__main__.py", line 94, in main spec.loader.exec_module(synth_module) # type: ignore File "<frozen importlib._bootstrap_external>", line 678, in exec_module File "<frozen importlib._bootstrap>", line 219, in _call_with_frames_removed File "/tmpfs/src/git/autosynth/working_repo/apis/Google.Cloud.Spanner.V1/synth.py", line 22, in <module> hide_output = False) File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/shell.py", line 
39, in run raise exc File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/shell.py", line 33, in run encoding="utf-8", File "/home/kbuilder/.pyenv/versions/3.6.9/lib/python3.6/subprocess.py", line 438, in run output=stdout, stderr=stderr) subprocess.CalledProcessError: Command '('/bin/bash', 'generateapis.sh', '--check_compatibility', 'Google.Cloud.Spanner.V1')' returned non-zero exit status 1. Synthesis failed Traceback (most recent call last): File "/home/kbuilder/.pyenv/versions/3.6.9/lib/python3.6/runpy.py", line 193, in _run_module_as_main "__main__", mod_spec) File "/home/kbuilder/.pyenv/versions/3.6.9/lib/python3.6/runpy.py", line 85, in _run_code exec(code, run_globals) File "/tmpfs/src/git/autosynth/autosynth/synth.py", line 482, in <module> main() File "/tmpfs/src/git/autosynth/autosynth/synth.py", line 334, in main return _inner_main(temp_dir) File "/tmpfs/src/git/autosynth/autosynth/synth.py", line 429, in _inner_main branch, File "/tmpfs/src/git/autosynth/autosynth/synth.py", line 135, in synthesize_loop metadata_path, extra_args, deprecated_execution, environ, synthesize_py_path, File "/tmpfs/src/git/autosynth/autosynth/synth.py", line 278, in synthesize synth_proc.check_returncode() # Raise an exception. File "/home/kbuilder/.pyenv/versions/3.6.9/lib/python3.6/subprocess.py", line 389, in check_returncode self.stderr) subprocess.CalledProcessError: Command '['/tmpfs/src/git/autosynth/env/bin/python3', '-m', 'synthtool', '--metadata', 'synth.metadata', 'synth.py', '--']' returned non-zero exit status 1. ``` Google internal developers can see the full log [here](https://sponge/e62b0d94-aaf2-4a14-b8f8-8803561a9577).
non_process
synthesis failed for google cloud spanner hello autosynth couldn t regenerate google cloud spanner broken heart here s the output from running synth py cloning into working repo switched to a new branch autosynth google cloud spanner cloning into tmpfs tmp googleapis note checking out you are in detached head state you can look around make experimental changes and commit them and you can discard any commits you make in this state without impacting any branches by performing another checkout if you want to create a new branch to retain commits you create you may do so now or later by using b with the checkout command again example git checkout b head is now at generated synth py files with multiple commits enabled note checking out you are in detached head state you can look around make experimental changes and commit them and you can discard any commits you make in this state without impacting any branches by performing another checkout if you want to create a new branch to retain commits you create you may do so now or later by using b with the checkout command again example git checkout b head is now at fix update gapic generator version to pickup discogapic fixes switched to a new branch autosynth google cloud spanner running synthtool synthtool executing tmpfs src git autosynth working repo apis google cloud spanner synth py contents of preconfig file preclonedrepos tmpfs src git autosynth working repo tmpfs tmp googleapis extracted repo location tmpfs tmp googleapis generateapis sh line declare googleapis readonly variable synthtool failed executing bin bash generateapis sh check compatibility google cloud spanner none synthtool wrote metadata to synth metadata traceback most recent call last file home kbuilder pyenv versions lib runpy py line in run module as main main mod spec file home kbuilder pyenv versions lib runpy py line in run code exec code run globals file tmpfs src git autosynth env lib site packages synthtool main py line in main file tmpfs src 
git autosynth env lib site packages click core py line in call return self main args kwargs file tmpfs src git autosynth env lib site packages click core py line in main rv self invoke ctx file tmpfs src git autosynth env lib site packages click core py line in invoke return ctx invoke self callback ctx params file tmpfs src git autosynth env lib site packages click core py line in invoke return callback args kwargs file tmpfs src git autosynth env lib site packages synthtool main py line in main spec loader exec module synth module type ignore file line in exec module file line in call with frames removed file tmpfs src git autosynth working repo apis google cloud spanner synth py line in hide output false file tmpfs src git autosynth env lib site packages synthtool shell py line in run raise exc file tmpfs src git autosynth env lib site packages synthtool shell py line in run encoding utf file home kbuilder pyenv versions lib subprocess py line in run output stdout stderr stderr subprocess calledprocesserror command bin bash generateapis sh check compatibility google cloud spanner returned non zero exit status synthesis failed traceback most recent call last file home kbuilder pyenv versions lib runpy py line in run module as main main mod spec file home kbuilder pyenv versions lib runpy py line in run code exec code run globals file tmpfs src git autosynth autosynth synth py line in main file tmpfs src git autosynth autosynth synth py line in main return inner main temp dir file tmpfs src git autosynth autosynth synth py line in inner main branch file tmpfs src git autosynth autosynth synth py line in synthesize loop metadata path extra args deprecated execution environ synthesize py path file tmpfs src git autosynth autosynth synth py line in synthesize synth proc check returncode raise an exception file home kbuilder pyenv versions lib subprocess py line in check returncode self stderr subprocess calledprocesserror command returned non zero exit status google 
internal developers can see the full log
0
694,184
23,805,598,177
IssuesEvent
2022-09-04 01:23:59
DevLUp-Inc/Feedback
https://api.github.com/repos/DevLUp-Inc/Feedback
closed
Investigate AWS CDK
Enhancement Low Priority Internal Research
**Notes** Split stateless and stateful resources into separate stacks L1, L2, L3 refers to abstraction levels
1.0
Investigate AWS CDK - **Notes** Split stateless and stateful resources into separate stacks L1, L2, L3 refers to abstraction levels
non_process
investigate aws cdk notes split stateless and stateful resources into separate stacks refers to abstraction levels
0
21,015
27,961,126,719
IssuesEvent
2023-03-24 15:44:03
JoTec2002/TINF21C_AAS_Management
https://api.github.com/repos/JoTec2002/TINF21C_AAS_Management
closed
Administration Management Page
in Process frontend
Creating a rough structure for the page, where the administrator has the opportunity for user management.
1.0
Administration Management Page - Creating a rough structure for the page, where the administrator has the opportunity for user management.
process
administration management page creating a rough structure for the page where the administrator has the opportunity for user management
1
595,009
18,059,184,360
IssuesEvent
2021-09-20 12:09:33
vaadin/flow-legacy-components
https://api.github.com/repos/vaadin/flow-legacy-components
closed
Include the child components added style names to the slot's class name
bug priority: high
After #37, when a component is added to AbstractOrderedLayout, it should check the style names of the child and include them to the slot, like: ```java Label legacyLabel = new Label("foobar"); legacyLabel.addStyleName("my-label"); verticalLayout.addComponent(legacyLabel); ``` ```html ... <div class="v-slot v-slot-my-label"> <div class="v-label v-widget my-label v-label-my-label v-label-undef-w">three</div> </div> ``` This will only work with legacy components and their style names - not with Flow components with class names. Implementation could be done so that the style name API in abstract component marks the parent as dirty when style name has changed. Or then just the layout listens for the property changes (or style name) in its children. Whatever feels less bad.
1.0
Include the child components added style names to the slot's class name - After #37, when a component is added to AbstractOrderedLayout, it should check the style names of the child and include them to the slot, like: ```java Label legacyLabel = new Label("foobar"); legacyLabel.addStyleName("my-label"); verticalLayout.addComponent(legacyLabel); ``` ```html ... <div class="v-slot v-slot-my-label"> <div class="v-label v-widget my-label v-label-my-label v-label-undef-w">three</div> </div> ``` This will only work with legacy components and their style names - not with Flow components with class names. Implementation could be done so that the style name API in abstract component marks the parent as dirty when style name has changed. Or then just the layout listens for the property changes (or style name) in its children. Whatever feels less bad.
non_process
include the child components added style names to the slot s class name after when a component is added to abstractorderedlayout it should check the style names of the child and include them to the slot like java label legacylabel new label foobar legacylabel addstylename my label verticallayout addcomponent legacylabel html three this will only work with legacy components and their style names not with flow components with class names implementation could be done so that the style name api in abstract component marks the parent as dirty when style name has changed or then just the layout listens for the property changes or style name in its children whatever feels less bad
0
174,353
14,479,354,200
IssuesEvent
2020-12-10 09:41:56
paoloemilioserra/Civil3dToolkit
https://api.github.com/repos/paoloemilioserra/Civil3dToolkit
closed
XML Tags - I/O - AlignmentExtensions.CreateAlignmentByPolyline
documentation
v1.1.15 The "name" input port has some spelling errors. Current: "The Alignment name. If the names is alredy used it returns the existing Alignment." Proposed: "...**name** is **already**..."
1.0
XML Tags - I/O - AlignmentExtensions.CreateAlignmentByPolyline - v1.1.15 The "name" input port has some spelling errors. Current: "The Alignment name. If the names is alredy used it returns the existing Alignment." Proposed: "...**name** is **already**..."
non_process
xml tags i o alignmentextensions createalignmentbypolyline the name input port has some spelling errors current the alignment name if the names is alredy used it returns the existing alignment proposed name is already
0
181,976
21,664,471,516
IssuesEvent
2022-05-07 01:27:42
scottstientjes/snipe-it
https://api.github.com/repos/scottstientjes/snipe-it
closed
WS-2020-0070 (High) detected in lodash-4.17.5.tgz - autoclosed
security vulnerability
## WS-2020-0070 - High Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>lodash-4.17.5.tgz</b></p></summary> <p>Lodash modular utilities.</p> <p>Library home page: <a href="https://registry.npmjs.org/lodash/-/lodash-4.17.5.tgz">https://registry.npmjs.org/lodash/-/lodash-4.17.5.tgz</a></p> <p>Path to dependency file: /tmp/ws-scm/snipe-it/package.json</p> <p>Path to vulnerable library: /snipe-it/node_modules/lodash/package.json</p> <p> Dependency Hierarchy: - :x: **lodash-4.17.5.tgz** (Vulnerable Library) <p>Found in HEAD commit: <a href="https://github.com/scottstientjes/snipe-it/commit/35f2b36393de933b01f7dd715958a7a89a2d783b">35f2b36393de933b01f7dd715958a7a89a2d783b</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary> <p> a prototype pollution vulnerability in lodash. It allows an attacker to inject properties on Object.prototype <p>Publish Date: 2020-04-28 <p>URL: <a href=https://hackerone.com/reports/712065>WS-2020-0070</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>8.1</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: High - Privileges Required: None - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: High - Integrity Impact: High - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> *** Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
True
WS-2020-0070 (High) detected in lodash-4.17.5.tgz - autoclosed - ## WS-2020-0070 - High Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>lodash-4.17.5.tgz</b></p></summary> <p>Lodash modular utilities.</p> <p>Library home page: <a href="https://registry.npmjs.org/lodash/-/lodash-4.17.5.tgz">https://registry.npmjs.org/lodash/-/lodash-4.17.5.tgz</a></p> <p>Path to dependency file: /tmp/ws-scm/snipe-it/package.json</p> <p>Path to vulnerable library: /snipe-it/node_modules/lodash/package.json</p> <p> Dependency Hierarchy: - :x: **lodash-4.17.5.tgz** (Vulnerable Library) <p>Found in HEAD commit: <a href="https://github.com/scottstientjes/snipe-it/commit/35f2b36393de933b01f7dd715958a7a89a2d783b">35f2b36393de933b01f7dd715958a7a89a2d783b</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary> <p> a prototype pollution vulnerability in lodash. It allows an attacker to inject properties on Object.prototype <p>Publish Date: 2020-04-28 <p>URL: <a href=https://hackerone.com/reports/712065>WS-2020-0070</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>8.1</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: High - Privileges Required: None - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: High - Integrity Impact: High - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> *** Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
non_process
ws high detected in lodash tgz autoclosed ws high severity vulnerability vulnerable library lodash tgz lodash modular utilities library home page a href path to dependency file tmp ws scm snipe it package json path to vulnerable library snipe it node modules lodash package json dependency hierarchy x lodash tgz vulnerable library found in head commit a href vulnerability details a prototype pollution vulnerability in lodash it allows an attacker to inject properties on object prototype publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity high privileges required none user interaction none scope unchanged impact metrics confidentiality impact high integrity impact high availability impact high for more information on scores click a href step up your open source security game with whitesource
0
20,362
27,020,693,533
IssuesEvent
2023-02-11 01:26:50
MikaylaFischler/cc-mek-scada
https://api.github.com/repos/MikaylaFischler/cc-mek-scada
closed
Process Target Energy Generation
supervisor coordinator process control
Coordinator should provide functionality for the user to simply request a specific target generation rate. It will then attempt to satisfy all input requirements to attain as close to the generation rate as possible. - [x] Calculate initial expected required burn rates based on turbine blade math - [x] Adjust to target with closed loop control - [x] Test control loop logic - [x] Tune gains - [x] Make blade mismatch check for generation rate mode only
1.0
Process Target Energy Generation - Coordinator should provide functionality for the user to simply request a specific target generation rate. It will then attempt to satisfy all input requirements to attain as close to the generation rate as possible. - [x] Calculate initial expected required burn rates based on turbine blade math - [x] Adjust to target with closed loop control - [x] Test control loop logic - [x] Tune gains - [x] Make blade mismatch check for generation rate mode only
process
process target energy generation coordinator should provide functionality for the user to simply request a specific target generation rate it will then attempt to satisfy all input requirements to attain as close to the generation rate as possible calculate initial expected required burn rates based on turbine blade math adjust to target with closed loop control test control loop logic tune gains make blade mismatch check for generation rate mode only
1
20,537
27,191,856,381
IssuesEvent
2023-02-19 22:02:07
lynnandtonic/nestflix.fun
https://api.github.com/repos/lynnandtonic/nestflix.fun
closed
Add Ello Gov'nor from Regular Show
suggested title in process
Please add as much of the following info as you can: Title: Ello Gov'nor Type (film/tv show): film Film or show in which it appears: Regular Show - Season 2 - Episode 1: Ello Gov'nor Is the parent film/show streaming anywhere? HBO Max About when in the parent film/show does it appear? 01:06 Actual footage of the film/show can be seen (yes/no)? yes
1.0
Add Ello Gov'nor from Regular Show - Please add as much of the following info as you can: Title: Ello Gov'nor Type (film/tv show): film Film or show in which it appears: Regular Show - Season 2 - Episode 1: Ello Gov'nor Is the parent film/show streaming anywhere? HBO Max About when in the parent film/show does it appear? 01:06 Actual footage of the film/show can be seen (yes/no)? yes
process
add ello gov nor from regular show please add as much of the following info as you can title ello gov nor type film tv show film film or show in which it appears regular show season episode ello gov nor is the parent film show streaming anywhere hbo max about when in the parent film show does it appear actual footage of the film show can be seen yes no yes
1
19,277
25,464,954,689
IssuesEvent
2022-11-25 02:31:59
openxla/stablehlo
https://api.github.com/repos/openxla/stablehlo
opened
Achieve obvious correctness of spec implementation
Process
As suggested by the title, this is a pretty aspirational ticket with the overall goal of making it as easy as possible for humans to map individual parts of the spec on the individual parts of the implementation, the finer grained the better.
1.0
Achieve obvious correctness of spec implementation - As suggested by the title, this is a pretty aspirational ticket with the overall goal of making it as easy as possible for humans to map individual parts of the spec on the individual parts of the implementation, the finer grained the better.
process
achieve obvious correctness of spec implementation as suggested by the title this is a pretty aspirational ticket with the overall goal of making it as easy as possible for humans to map individual parts of the spec on the individual parts of the implementation the finer grained the better
1
201,527
7,032,821,437
IssuesEvent
2017-12-27 07:01:19
redsunservers/LoadoutBugTracker
https://api.github.com/repos/redsunservers/LoadoutBugTracker
closed
MOTD is extremely inconsistent and not completing ajax calls properly.
priority:low source:website type:bug type:inconsistensy
Valve loves the MOTD so much they make it behave like a 1/3rd of a web browser, causing many inconsistenties with ajax calls thru the MOTD for stuff like gifting, equipping, etc. **Solution:** Somehow make it recognize the MOTD in the javascript code or apply a patch to make sure that MOTD users are crippled off using the ajax functions through the MOTD and encourage them to use the regular browser instead. thanks valve 😄
1.0
MOTD is extremely inconsistent and not completing ajax calls properly. - Valve loves the MOTD so much they make it behave like a 1/3rd of a web browser, causing many inconsistenties with ajax calls thru the MOTD for stuff like gifting, equipping, etc. **Solution:** Somehow make it recognize the MOTD in the javascript code or apply a patch to make sure that MOTD users are crippled off using the ajax functions through the MOTD and encourage them to use the regular browser instead. thanks valve 😄
non_process
motd is extremely inconsistent and not completing ajax calls properly valve loves the motd so much they make it behave like a of a web browser causing many inconsistenties with ajax calls thru the motd for stuff like gifting equipping etc solution somehow make it recognize the motd in the javascript code or apply a patch to make sure that motd users are crippled off using the ajax functions through the motd and encourage them to use the regular browser instead thanks valve 😄
0
784,963
27,591,134,898
IssuesEvent
2023-03-09 00:31:55
status-im/status-desktop
https://api.github.com/repos/status-im/status-desktop
opened
Wallet: Active account changes when clicking send transaction from a watched account
bug priority 2: medium E:Bugfixes E:Wallet
# Bug Report ## Description After clicking send transaction button from a transaction info screen (transaction selected from activity tab) the active account will change from the watched account to the status default account. Closing the modal results in the transaction info being blank as the account has changed. https://user-images.githubusercontent.com/50769329/223883857-74a7423d-e58d-4a51-85dc-09e00a359f5d.mp4 ## Steps to reproduce Add watch account Select a transaction from the activity tab Select send transaction Close send modal #### Expected behavior If the send transaction modal is closed then the user returns to the same account and transaction that was open #### Actual behavior The active account is changed unexpectedly from the watched account to the Status default account. ### Additional Information - Status desktop version: Master - Operating System: Mac
1.0
Wallet: Active account changes when clicking send transaction from a watched account - # Bug Report ## Description After clicking send transaction button from a transaction info screen (transaction selected from activity tab) the active account will change from the watched account to the status default account. Closing the modal results in the transaction info being blank as the account has changed. https://user-images.githubusercontent.com/50769329/223883857-74a7423d-e58d-4a51-85dc-09e00a359f5d.mp4 ## Steps to reproduce Add watch account Select a transaction from the activity tab Select send transaction Close send modal #### Expected behavior If the send transaction modal is closed then the user returns to the same account and transaction that was open #### Actual behavior The active account is changed unexpectedly from the watched account to the Status default account. ### Additional Information - Status desktop version: Master - Operating System: Mac
non_process
wallet active account changes when clicking send transaction from a watched account bug report description after clicking send transaction button from a transaction info screen transaction selected from activity tab the active account will change from the watched account to the status default account closing the modal results in the transaction info being blank as the account has changed steps to reproduce add watch account select a transaction from the activity tab select send transaction close send modal expected behavior if the send transaction modal is closed then the user returns to the same account and transaction that was open actual behavior the active account is changed unexpectedly from the watched account to the status default account additional information status desktop version master operating system mac
0
16,004
20,188,209,308
IssuesEvent
2022-02-11 01:18:12
savitamittalmsft/WAS-SEC-TEST
https://api.github.com/repos/savitamittalmsft/WAS-SEC-TEST
opened
Prohibit direct internet access of virtual machines with policy, logging, and monitoring
WARP-Import WAF FEB 2021 Security Performance and Scalability Capacity Management Processes Networking & Connectivity Endpoints
<a href="https://docs.microsoft.com/azure/architecture/framework/Security/governance#remove-virtual-machine-vm-direct-internet-connectivity">Prohibit direct internet access of virtual machines with policy, logging, and monitoring</a> <p><b>Why Consider This?</b></p> Attackers constantly scan public cloud IP ranges for open management ports and attempt to exploit weak credentials ('password spray') and unpatched vulnerabilities in management protocols like SSH and RDP. Preventing direct Internet access to VMs stops a misconfiguration or oversight becoming more serious. <p><b>Context</b></p> <p><b>Suggested Actions</b></p> <p><span>Organizations should attempt to ensure policy and processes require restricting and monitoring direct internet connectivity by virtual machines.</span></p> <p><b>Learn More</b></p> <p><a href="https://docs.microsoft.com/en-us/azure/architecture/framework/Security/governance#remove-virtual-machine-vm-direct-internet-connectivity" target="_blank"><span>Remove Virtual Machine (VM) direct internet connectivity</span></a><span /></p>
1.0
Prohibit direct internet access of virtual machines with policy, logging, and monitoring - <a href="https://docs.microsoft.com/azure/architecture/framework/Security/governance#remove-virtual-machine-vm-direct-internet-connectivity">Prohibit direct internet access of virtual machines with policy, logging, and monitoring</a> <p><b>Why Consider This?</b></p> Attackers constantly scan public cloud IP ranges for open management ports and attempt to exploit weak credentials ('password spray') and unpatched vulnerabilities in management protocols like SSH and RDP. Preventing direct Internet access to VMs stops a misconfiguration or oversight becoming more serious. <p><b>Context</b></p> <p><b>Suggested Actions</b></p> <p><span>Organizations should attempt to ensure policy and processes require restricting and monitoring direct internet connectivity by virtual machines.</span></p> <p><b>Learn More</b></p> <p><a href="https://docs.microsoft.com/en-us/azure/architecture/framework/Security/governance#remove-virtual-machine-vm-direct-internet-connectivity" target="_blank"><span>Remove Virtual Machine (VM) direct internet connectivity</span></a><span /></p>
process
prohibit direct internet access of virtual machines with policy logging and monitoring why consider this attackers constantly scan public cloud ip ranges for open management ports and attempt to exploit weak credentials password spray and unpatched vulnerabilities in management protocols like ssh and rdp preventing direct internet access to vms stops a misconfiguration or oversight becoming more serious context suggested actions organizations should attempt to ensure policy and processes require restricting and monitoring direct internet connectivity by virtual machines learn more remove virtual machine vm direct internet connectivity
1
12,384
14,900,120,197
IssuesEvent
2021-01-21 15:04:36
googleapis/gapic-generator-go
https://api.github.com/repos/googleapis/gapic-generator-go
closed
Action Required: Fix Renovate Configuration
type: process
There is an error with this repository's Renovate configuration that needs to be fixed. As a precaution, Renovate will stop PRs until it is resolved. Error type: undefined. Note: this is a *nested* preset so please contact the preset author if you are unable to fix it yourself.
1.0
Action Required: Fix Renovate Configuration - There is an error with this repository's Renovate configuration that needs to be fixed. As a precaution, Renovate will stop PRs until it is resolved. Error type: undefined. Note: this is a *nested* preset so please contact the preset author if you are unable to fix it yourself.
process
action required fix renovate configuration there is an error with this repository s renovate configuration that needs to be fixed as a precaution renovate will stop prs until it is resolved error type undefined note this is a nested preset so please contact the preset author if you are unable to fix it yourself
1
21,702
30,198,601,117
IssuesEvent
2023-07-05 02:00:10
lizhihao6/get-daily-arxiv-noti
https://api.github.com/repos/lizhihao6/get-daily-arxiv-noti
opened
New submissions for Tue, 4 Jul 23
event camera white balance isp compression image signal processing image signal process raw raw image events camera color contrast events AWB
## Keyword: events ### A MIL Approach for Anomaly Detection in Surveillance Videos from Multiple Camera Views - **Authors:** Silas Santiago Lopes Pereira, José Everardo Bessa Maia - **Subjects:** Computer Vision and Pattern Recognition (cs.CV) - **Arxiv link:** https://arxiv.org/abs/2307.00562 - **Pdf link:** https://arxiv.org/pdf/2307.00562 - **Abstract** Occlusion and clutter are two scene states that make it difficult to detect anomalies in surveillance video. Furthermore, anomaly events are rare and, as a consequence, class imbalance and lack of labeled anomaly data are also key features of this task. Therefore, weakly supervised methods are heavily researched for this application. In this paper, we tackle these typical problems of anomaly detection in surveillance video by combining Multiple Instance Learning (MIL) to deal with the lack of labels and Multiple Camera Views (MC) to reduce occlusion and clutter effects. In the resulting MC-MIL algorithm we apply a multiple camera combined loss function to train a regression network with Sultani's MIL ranking function. To evaluate the MC-MIL algorithm first proposed here, the multiple camera PETS-2009 benchmark dataset was re-labeled for the anomaly detection task from multiple camera views. The result shows a significant performance improvement in F1 score compared to the single-camera configuration. ## Keyword: event camera ### Cross-modal Place Recognition in Image Databases using Event-based Sensors - **Authors:** Xiang Ji, Jiaxin Wei, Yifu Wang, Huiliang Shang, Laurent Kneip - **Subjects:** Computer Vision and Pattern Recognition (cs.CV) - **Arxiv link:** https://arxiv.org/abs/2307.01047 - **Pdf link:** https://arxiv.org/pdf/2307.01047 - **Abstract** Visual place recognition is an important problem towards global localization in many robotics tasks. One of the biggest challenges is that it may suffer from illumination or appearance changes in surrounding environments. 
Event cameras are interesting alternatives to frame-based sensors as their high dynamic range enables robust perception in difficult illumination conditions. However, current event-based place recognition methods only rely on event information, which restricts downstream applications of VPR. In this paper, we present the first cross-modal visual place recognition framework that is capable of retrieving regular images from a database given an event query. Our method demonstrates promising results with respect to the state-of-the-art frame-based and event-based methods on the Brisbane-Event-VPR dataset under different scenarios. We also verify the effectiveness of the combination of retrieval and classification, which can boost performance by a large margin. ## Keyword: events camera There is no result ## Keyword: white balance There is no result ## Keyword: color contrast There is no result ## Keyword: AWB ### Brightness-Restricted Adversarial Attack Patch - **Authors:** Mingzhen Shao - **Subjects:** Computer Vision and Pattern Recognition (cs.CV) - **Arxiv link:** https://arxiv.org/abs/2307.00421 - **Pdf link:** https://arxiv.org/pdf/2307.00421 - **Abstract** Adversarial attack patches have gained increasing attention due to their practical applicability in physical-world scenarios. However, the bright colors used in attack patches represent a significant drawback, as they can be easily identified by human observers. Moreover, even though these attacks have been highly successful in deceiving target networks, which specific features of the attack patch contribute to its success are still unknown. Our paper introduces a brightness-restricted patch (BrPatch) that uses optical characteristics to effectively reduce conspicuousness while preserving image independence. We also conducted an analysis of the impact of various image features (such as color, texture, noise, and size) on the effectiveness of an attack patch in physical-world deployment. 
Our experiments show that attack patches exhibit strong redundancy to brightness and are resistant to color transfer and noise. Based on our findings, we propose some additional methods to further reduce the conspicuousness of BrPatch. Our findings also explain the robustness of attack patches observed in physical-world scenarios. ### ACDMSR: Accelerated Conditional Diffusion Models for Single Image Super-Resolution - **Authors:** Axi Niu, Pham Xuan Trung, Kang Zhang, Jinqiu Sun, Yu Zhu, In So Kweon, Yanning Zhang - **Subjects:** Computer Vision and Pattern Recognition (cs.CV); Image and Video Processing (eess.IV) - **Arxiv link:** https://arxiv.org/abs/2307.00781 - **Pdf link:** https://arxiv.org/pdf/2307.00781 - **Abstract** Diffusion models have gained significant popularity in the field of image-to-image translation. Previous efforts applying diffusion models to image super-resolution (SR) have demonstrated that iteratively refining pure Gaussian noise using a U-Net architecture trained on denoising at various noise levels can yield satisfactory high-resolution images from low-resolution inputs. However, this iterative refinement process comes with the drawback of low inference speed, which strongly limits its applications. To speed up inference and further enhance the performance, our research revisits diffusion models in image super-resolution and proposes a straightforward yet significant diffusion model-based super-resolution method called ACDMSR (accelerated conditional diffusion model for image super-resolution). Specifically, our method adapts the standard diffusion model to perform super-resolution through a deterministic iterative denoising process. Our study also highlights the effectiveness of using a pre-trained SR model to provide the conditional image of the given low-resolution (LR) image to achieve superior high-resolution results. 
We demonstrate that our method surpasses previous attempts in qualitative and quantitative results through extensive experiments conducted on benchmark datasets such as Set5, Set14, Urban100, BSD100, and Manga109. Moreover, our approach generates more visually realistic counterparts for low-resolution images, emphasizing its effectiveness in practical scenarios. ## Keyword: ISP ### SDRCNN: A single-scale dense residual connected convolutional neural network for pansharpening - **Authors:** Yuan Fang, Yuanzhi Cai, Lei Fan - **Subjects:** Computer Vision and Pattern Recognition (cs.CV); Image and Video Processing (eess.IV) - **Arxiv link:** https://arxiv.org/abs/2307.00327 - **Pdf link:** https://arxiv.org/pdf/2307.00327 - **Abstract** Pansharpening is a process of fusing a high spatial resolution panchromatic image and a low spatial resolution multispectral image to create a high-resolution multispectral image. A novel single-branch, single-scale lightweight convolutional neural network, named SDRCNN, is developed in this study. By using a novel dense residual connected structure and convolution block, SDRCNN achieved a better trade-off between accuracy and efficiency. The performance of SDRCNN was tested using four datasets from the WorldView-3, WorldView-2 and QuickBird satellites. The compared methods include eight traditional methods (i.e., GS, GSA, PRACS, BDSD, SFIM, GLP-CBD, CDIF and LRTCFPan) and five lightweight deep learning methods (i.e., PNN, PanNet, BayesianNet, DMDNet and FusionNet). Based on a visual inspection of the pansharpened images created and the associated absolute residual maps, SDRCNN exhibited least spatial detail blurring and spectral distortion, amongst all the methods considered. The values of the quantitative evaluation metrics were closest to their ideal values when SDRCNN was used. The processing time of SDRCNN was also the shortest among all methods tested. 
Finally, the effectiveness of each component in the SDRCNN was demonstrated in ablation experiments. All of these confirmed the superiority of SDRCNN. ## Keyword: image signal processing There is no result ## Keyword: image signal process There is no result ## Keyword: compression ### A Proximal Algorithm for Network Slimming - **Authors:** Kevin Bui, Fanghui Xue, Fredrick Park, Yingyong Qi, Jack Xin - **Subjects:** Computer Vision and Pattern Recognition (cs.CV) - **Arxiv link:** https://arxiv.org/abs/2307.00684 - **Pdf link:** https://arxiv.org/pdf/2307.00684 - **Abstract** As a popular channel pruning method for convolutional neural networks (CNNs), network slimming (NS) has a three-stage process: (1) it trains a CNN with $\ell_1$ regularization applied to the scaling factors of the batch normalization layers; (2) it removes channels whose scaling factors are below a chosen threshold; and (3) it retrains the pruned model to recover the original accuracy. This time-consuming, three-step process is a result of using subgradient descent to train CNNs. Because subgradient descent does not exactly train CNNs towards sparse, accurate structures, the latter two steps are necessary. Moreover, subgradient descent does not have any convergence guarantee. Therefore, we develop an alternative algorithm called proximal NS. Our proposed algorithm trains CNNs towards sparse, accurate structures, so identifying a scaling factor threshold is unnecessary and fine tuning the pruned CNNs is optional. Using Kurdyka-{\L}ojasiewicz assumptions, we establish global convergence of proximal NS. Lastly, we validate the efficacy of the proposed algorithm on VGGNet, DenseNet and ResNet on CIFAR 10/100. Our experiments demonstrate that after one round of training, proximal NS yields a CNN with competitive accuracy and compression. 
### Structured Network Pruning by Measuring Filter-wise Interactions - **Authors:** Wenting Tang, Xingxing Wei, Bo Li (Beijing Key Laboratory of Digital Media, School of Computer Science and Engineering, Beihang University, Beijing, China) - **Subjects:** Computer Vision and Pattern Recognition (cs.CV) - **Arxiv link:** https://arxiv.org/abs/2307.00758 - **Pdf link:** https://arxiv.org/pdf/2307.00758 - **Abstract** Structured network pruning is a practical approach to reduce computation cost directly while retaining the CNNs' generalization performance in real applications. However, identifying redundant filters is a core problem in structured network pruning, and current redundancy criteria only focus on individual filters' attributes. When pruning sparsity increases, these redundancy criteria are not effective or efficient enough. Since the filter-wise interaction also contributes to the CNN's prediction accuracy, we integrate the filter-wise interaction into the redundancy criterion. In our criterion, we introduce the filter importance and filter utilization strength to reflect the decision ability of individual and multiple filters. Utilizing this new redundancy criterion, we propose a structured network pruning approach SNPFI (Structured Network Pruning by measuring Filter-wise Interaction). During the pruning, the SNPFI can automatically assign the proper sparsity based on the filter utilization strength and eliminate the useless filters by filter importance. After the pruning, the SNPFI can recover pruned model's performance effectively without iterative training by minimizing the interaction difference. We empirically demonstrate the effectiveness of the SNPFI with several commonly used CNN models, including AlexNet, MobileNetv1, and ResNet-50, on various image classification datasets, including MNIST, CIFAR-10, and ImageNet. 
For all experimental CNN models, nearly 60% of computation is reduced in a network compression while the classification accuracy remains. ### NeuBTF: Neural fields for BTF encoding and transfer - **Authors:** Carlos Rodriguez-Pardo, Konstantinos Kazatzis, Jorge Lopez-Moreno, Elena Garces - **Subjects:** Computer Vision and Pattern Recognition (cs.CV); Artificial Intelligence (cs.AI); Graphics (cs.GR); Machine Learning (cs.LG) - **Arxiv link:** https://arxiv.org/abs/2307.01199 - **Pdf link:** https://arxiv.org/pdf/2307.01199 - **Abstract** Neural material representations are becoming a popular way to represent materials for rendering. They are more expressive than analytic models and occupy less memory than tabulated BTFs. However, existing neural materials are immutable, meaning that their output for a certain query of UVs, camera, and light vector is fixed once they are trained. While this is practical when there is no need to edit the material, it can become very limiting when the fragment of the material used for training is too small or not tileable, which frequently happens when the material has been captured with a gonioreflectometer. In this paper, we propose a novel neural material representation which jointly tackles the problems of BTF compression, tiling, and extrapolation. At test time, our method uses a guidance image as input to condition the neural BTF to the structural features of this input image. Then, the neural BTF can be queried as a regular BTF using UVs, camera, and light vectors. Every component in our framework is purposefully designed to maximize BTF encoding quality at minimal parameter count and computational complexity, achieving competitive compression rates compared with previous work. We demonstrate the results of our method on a variety of synthetic and captured materials, showing its generality and capacity to learn to represent many optical properties. 
## Keyword: RAW ### Situated Cameras, Situated Knowledges: Towards an Egocentric Epistemology for Computer Vision - **Authors:** Samuel Goree, David Crandall - **Subjects:** Computer Vision and Pattern Recognition (cs.CV) - **Arxiv link:** https://arxiv.org/abs/2307.00064 - **Pdf link:** https://arxiv.org/pdf/2307.00064 - **Abstract** In her influential 1988 paper, Situated Knowledges, Donna Haraway uses vision and perspective as a metaphor to discuss scientific knowledge. Today, egocentric computer vision discusses many of the same issues, except in a literal vision context. In this short position paper, we collapse that metaphor, and explore the interactions between feminist epistemology and egocentric CV as "Egocentric Epistemology." Using this framework, we argue for the use of qualitative, human-centric methods as a complement to performance benchmarks, to center both the literal and metaphorical perspective of human crowd workers in CV. ### Brightness-Restricted Adversarial Attack Patch - **Authors:** Mingzhen Shao - **Subjects:** Computer Vision and Pattern Recognition (cs.CV) - **Arxiv link:** https://arxiv.org/abs/2307.00421 - **Pdf link:** https://arxiv.org/pdf/2307.00421 - **Abstract** Adversarial attack patches have gained increasing attention due to their practical applicability in physical-world scenarios. However, the bright colors used in attack patches represent a significant drawback, as they can be easily identified by human observers. Moreover, even though these attacks have been highly successful in deceiving target networks, which specific features of the attack patch contribute to its success are still unknown. Our paper introduces a brightness-restricted patch (BrPatch) that uses optical characteristics to effectively reduce conspicuousness while preserving image independence. 
# New submissions for Tue, 4 Jul 23

## Keyword: events

### A MIL Approach for Anomaly Detection in Surveillance Videos from Multiple Camera Views
- **Authors:** Silas Santiago Lopes Pereira, José Everardo Bessa Maia
- **Subjects:** Computer Vision and Pattern Recognition (cs.CV)
- **Arxiv link:** https://arxiv.org/abs/2307.00562
- **Pdf link:** https://arxiv.org/pdf/2307.00562
- **Abstract** Occlusion and clutter are two scene states that make it difficult to detect anomalies in surveillance video. Furthermore, anomaly events are rare and, as a consequence, class imbalance and lack of labeled anomaly data are also key features of this task. Therefore, weakly supervised methods are heavily researched for this application. In this paper, we tackle these typical problems of anomaly detection in surveillance video by combining Multiple Instance Learning (MIL) to deal with the lack of labels and Multiple Camera Views (MC) to reduce occlusion and clutter effects. In the resulting MC-MIL algorithm we apply a multiple camera combined loss function to train a regression network with Sultani's MIL ranking function. To evaluate the MC-MIL algorithm first proposed here, the multiple camera PETS-2009 benchmark dataset was re-labeled for the anomaly detection task from multiple camera views. The result shows a significant performance improvement in F1 score compared to the single-camera configuration.

## Keyword: event camera

### Cross-modal Place Recognition in Image Databases using Event-based Sensors
- **Authors:** Xiang Ji, Jiaxin Wei, Yifu Wang, Huiliang Shang, Laurent Kneip
- **Subjects:** Computer Vision and Pattern Recognition (cs.CV)
- **Arxiv link:** https://arxiv.org/abs/2307.01047
- **Pdf link:** https://arxiv.org/pdf/2307.01047
- **Abstract** Visual place recognition is an important problem towards global localization in many robotics tasks. One of the biggest challenges is that it may suffer from illumination or appearance changes in surrounding environments. Event cameras are interesting alternatives to frame-based sensors as their high dynamic range enables robust perception in difficult illumination conditions. However, current event-based place recognition methods only rely on event information, which restricts downstream applications of VPR. In this paper, we present the first cross-modal visual place recognition framework that is capable of retrieving regular images from a database given an event query. Our method demonstrates promising results with respect to the state-of-the-art frame-based and event-based methods on the Brisbane-Event-VPR dataset under different scenarios. We also verify the effectiveness of the combination of retrieval and classification, which can boost performance by a large margin.

## Keyword: events camera

There is no result

## Keyword: white balance

There is no result

## Keyword: color contrast

There is no result

## Keyword: AWB

### Brightness-Restricted Adversarial Attack Patch
- **Authors:** Mingzhen Shao
- **Subjects:** Computer Vision and Pattern Recognition (cs.CV)
- **Arxiv link:** https://arxiv.org/abs/2307.00421
- **Pdf link:** https://arxiv.org/pdf/2307.00421
- **Abstract** Adversarial attack patches have gained increasing attention due to their practical applicability in physical-world scenarios. However, the bright colors used in attack patches represent a significant drawback, as they can be easily identified by human observers. Moreover, even though these attacks have been highly successful in deceiving target networks, which specific features of the attack patch contribute to its success are still unknown. Our paper introduces a brightness-restricted patch (BrPatch) that uses optical characteristics to effectively reduce conspicuousness while preserving image independence. We also conducted an analysis of the impact of various image features (such as color, texture, noise, and size) on the effectiveness of an attack patch in physical-world deployment. Our experiments show that attack patches exhibit strong redundancy to brightness and are resistant to color transfer and noise. Based on our findings, we propose some additional methods to further reduce the conspicuousness of BrPatch. Our findings also explain the robustness of attack patches observed in physical-world scenarios.

### ACDMSR: Accelerated Conditional Diffusion Models for Single Image Super-Resolution
- **Authors:** Axi Niu, Pham Xuan Trung, Kang Zhang, Jinqiu Sun, Yu Zhu, In So Kweon, Yanning Zhang
- **Subjects:** Computer Vision and Pattern Recognition (cs.CV); Image and Video Processing (eess.IV)
- **Arxiv link:** https://arxiv.org/abs/2307.00781
- **Pdf link:** https://arxiv.org/pdf/2307.00781
- **Abstract** Diffusion models have gained significant popularity in the field of image-to-image translation. Previous efforts applying diffusion models to image super-resolution (SR) have demonstrated that iteratively refining pure Gaussian noise using a U-Net architecture trained on denoising at various noise levels can yield satisfactory high-resolution images from low-resolution inputs. However, this iterative refinement process comes with the drawback of low inference speed, which strongly limits its applications. To speed up inference and further enhance the performance, our research revisits diffusion models in image super-resolution and proposes a straightforward yet significant diffusion model-based super-resolution method called ACDMSR (accelerated conditional diffusion model for image super-resolution). Specifically, our method adapts the standard diffusion model to perform super-resolution through a deterministic iterative denoising process. Our study also highlights the effectiveness of using a pre-trained SR model to provide the conditional image of the given low-resolution (LR) image to achieve superior high-resolution results. We demonstrate that our method surpasses previous attempts in qualitative and quantitative results through extensive experiments conducted on benchmark datasets such as Set5, Set14, Urban100, BSD100, and Manga109. Moreover, our approach generates more visually realistic counterparts for low-resolution images, emphasizing its effectiveness in practical scenarios.

## Keyword: ISP

### SDRCNN: A single-scale dense residual connected convolutional neural network for pansharpening
- **Authors:** Yuan Fang, Yuanzhi Cai, Lei Fan
- **Subjects:** Computer Vision and Pattern Recognition (cs.CV); Image and Video Processing (eess.IV)
- **Arxiv link:** https://arxiv.org/abs/2307.00327
- **Pdf link:** https://arxiv.org/pdf/2307.00327
- **Abstract** Pansharpening is a process of fusing a high spatial resolution panchromatic image and a low spatial resolution multispectral image to create a high-resolution multispectral image. A novel single-branch, single-scale lightweight convolutional neural network, named SDRCNN, is developed in this study. By using a novel dense residual connected structure and convolution block, SDRCNN achieved a better trade-off between accuracy and efficiency. The performance of SDRCNN was tested using four datasets from the WorldView-3, WorldView-2 and QuickBird satellites. The compared methods include eight traditional methods (i.e., GS, GSA, PRACS, BDSD, SFIM, GLP-CBD, CDIF and LRTCFPan) and five lightweight deep learning methods (i.e., PNN, PanNet, BayesianNet, DMDNet and FusionNet). Based on a visual inspection of the pansharpened images created and the associated absolute residual maps, SDRCNN exhibited least spatial detail blurring and spectral distortion, amongst all the methods considered. The values of the quantitative evaluation metrics were closest to their ideal values when SDRCNN was used. The processing time of SDRCNN was also the shortest among all methods tested. Finally, the effectiveness of each component in the SDRCNN was demonstrated in ablation experiments. All of these confirmed the superiority of SDRCNN.

## Keyword: image signal processing

There is no result

## Keyword: image signal process

There is no result

## Keyword: compression

### A Proximal Algorithm for Network Slimming
- **Authors:** Kevin Bui, Fanghui Xue, Fredrick Park, Yingyong Qi, Jack Xin
- **Subjects:** Computer Vision and Pattern Recognition (cs.CV)
- **Arxiv link:** https://arxiv.org/abs/2307.00684
- **Pdf link:** https://arxiv.org/pdf/2307.00684
- **Abstract** As a popular channel pruning method for convolutional neural networks (CNNs), network slimming (NS) has a three-stage process: (1) it trains a CNN with $\ell_1$ regularization applied to the scaling factors of the batch normalization layers; (2) it removes channels whose scaling factors are below a chosen threshold; and (3) it retrains the pruned model to recover the original accuracy. This time-consuming, three-step process is a result of using subgradient descent to train CNNs. Because subgradient descent does not exactly train CNNs towards sparse, accurate structures, the latter two steps are necessary. Moreover, subgradient descent does not have any convergence guarantee. Therefore, we develop an alternative algorithm called proximal NS. Our proposed algorithm trains CNNs towards sparse, accurate structures, so identifying a scaling factor threshold is unnecessary and fine tuning the pruned CNNs is optional. Using Kurdyka-Łojasiewicz assumptions, we establish global convergence of proximal NS. Lastly, we validate the efficacy of the proposed algorithm on VGGNet, DenseNet and ResNet on CIFAR 10/100. Our experiments demonstrate that after one round of training, proximal NS yields a CNN with competitive accuracy and compression.
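An editorial note on the abstract above: the proximal step for an $\ell_1$ penalty has a closed form, soft-thresholding, which shrinks the batch-normalization scaling factors and zeroes out small ones exactly rather than merely nudging them toward zero as subgradient descent does. A minimal NumPy sketch (the threshold and factor values below are illustrative, not from the paper):

```python
import numpy as np

def soft_threshold(gamma, lam):
    # Proximal operator of lam * ||.||_1: shrinks each scaling
    # factor by lam in magnitude and sets small ones exactly to 0.
    return np.sign(gamma) * np.maximum(np.abs(gamma) - lam, 0.0)

# Hypothetical BN scaling factors for a 5-channel layer.
gamma = np.array([0.9, -0.02, 0.4, 0.01, -0.6])
pruned = soft_threshold(gamma, 0.05)
keep = np.flatnonzero(pruned != 0.0)  # channels that survive pruning
```

Because the small factors become exactly zero, the surviving-channel set falls out of the iterate itself, which is why no separate threshold-picking step is needed.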
### Structured Network Pruning by Measuring Filter-wise Interactions
- **Authors:** Wenting Tang, Xingxing Wei, Bo Li (Beijing Key Laboratory of Digital Media, School of Computer Science and Engineering, Beihang University, Beijing, China)
- **Subjects:** Computer Vision and Pattern Recognition (cs.CV)
- **Arxiv link:** https://arxiv.org/abs/2307.00758
- **Pdf link:** https://arxiv.org/pdf/2307.00758
- **Abstract** Structured network pruning is a practical approach to reduce computation cost directly while retaining the CNNs' generalization performance in real applications. However, identifying redundant filters is a core problem in structured network pruning, and current redundancy criteria only focus on individual filters' attributes. When pruning sparsity increases, these redundancy criteria are not effective or efficient enough. Since the filter-wise interaction also contributes to the CNN's prediction accuracy, we integrate the filter-wise interaction into the redundancy criterion. In our criterion, we introduce the filter importance and filter utilization strength to reflect the decision ability of individual and multiple filters. Utilizing this new redundancy criterion, we propose a structured network pruning approach SNPFI (Structured Network Pruning by measuring Filter-wise Interaction). During the pruning, the SNPFI can automatically assign the proper sparsity based on the filter utilization strength and eliminate the useless filters by filter importance. After the pruning, the SNPFI can recover the pruned model's performance effectively without iterative training by minimizing the interaction difference. We empirically demonstrate the effectiveness of the SNPFI with several commonly used CNN models, including AlexNet, MobileNetv1, and ResNet-50, on various image classification datasets, including MNIST, CIFAR-10, and ImageNet. For all experimental CNN models, nearly 60% of computation is reduced in a network compression while the classification accuracy remains.

### NeuBTF: Neural fields for BTF encoding and transfer
- **Authors:** Carlos Rodriguez-Pardo, Konstantinos Kazatzis, Jorge Lopez-Moreno, Elena Garces
- **Subjects:** Computer Vision and Pattern Recognition (cs.CV); Artificial Intelligence (cs.AI); Graphics (cs.GR); Machine Learning (cs.LG)
- **Arxiv link:** https://arxiv.org/abs/2307.01199
- **Pdf link:** https://arxiv.org/pdf/2307.01199
- **Abstract** Neural material representations are becoming a popular way to represent materials for rendering. They are more expressive than analytic models and occupy less memory than tabulated BTFs. However, existing neural materials are immutable, meaning that their output for a certain query of UVs, camera, and light vector is fixed once they are trained. While this is practical when there is no need to edit the material, it can become very limiting when the fragment of the material used for training is too small or not tileable, which frequently happens when the material has been captured with a gonioreflectometer. In this paper, we propose a novel neural material representation which jointly tackles the problems of BTF compression, tiling, and extrapolation. At test time, our method uses a guidance image as input to condition the neural BTF to the structural features of this input image. Then, the neural BTF can be queried as a regular BTF using UVs, camera, and light vectors. Every component in our framework is purposefully designed to maximize BTF encoding quality at minimal parameter count and computational complexity, achieving competitive compression rates compared with previous work. We demonstrate the results of our method on a variety of synthetic and captured materials, showing its generality and capacity to learn to represent many optical properties.
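An editorial note: the "queried as a regular BTF" interface described above amounts to evaluating a small network at a (UV, view, light) query and reading off RGB reflectance. A toy NumPy sketch with randomly initialized weights (the 2-D direction encoding and layer sizes are illustrative assumptions, not NeuBTF's actual architecture):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for a neural BTF: a tiny MLP mapping a 6-D query
# (uv, view direction, light direction, each encoded in 2-D)
# to an RGB reflectance value. Weights here are random; a real
# model would be trained to reproduce measured reflectance.
W1, b1 = rng.normal(size=(6, 16)), np.zeros(16)
W2, b2 = rng.normal(size=(16, 3)), np.zeros(3)

def query_btf(uv, view, light):
    x = np.concatenate([uv, view, light])          # 6-D query vector
    h = np.maximum(x @ W1 + b1, 0.0)               # ReLU hidden layer
    return 1.0 / (1.0 + np.exp(-(h @ W2 + b2)))    # sigmoid -> RGB in [0, 1]

rgb = query_btf(np.array([0.25, 0.75]),
                np.array([0.0, 1.0]),
                np.array([0.5, 0.5]))
```

The compression claim in the abstract follows from this shape: a few thousand MLP parameters replace a dense table over all (UV, view, light) combinations.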
## Keyword: RAW

### Situated Cameras, Situated Knowledges: Towards an Egocentric Epistemology for Computer Vision
- **Authors:** Samuel Goree, David Crandall
- **Subjects:** Computer Vision and Pattern Recognition (cs.CV)
- **Arxiv link:** https://arxiv.org/abs/2307.00064
- **Pdf link:** https://arxiv.org/pdf/2307.00064
- **Abstract** In her influential 1988 paper, Situated Knowledges, Donna Haraway uses vision and perspective as a metaphor to discuss scientific knowledge. Today, egocentric computer vision discusses many of the same issues, except in a literal vision context. In this short position paper, we collapse that metaphor, and explore the interactions between feminist epistemology and egocentric CV as "Egocentric Epistemology." Using this framework, we argue for the use of qualitative, human-centric methods as a complement to performance benchmarks, to center both the literal and metaphorical perspective of human crowd workers in CV.

### Brightness-Restricted Adversarial Attack Patch
- **Authors:** Mingzhen Shao
- **Subjects:** Computer Vision and Pattern Recognition (cs.CV)
- **Arxiv link:** https://arxiv.org/abs/2307.00421
- **Pdf link:** https://arxiv.org/pdf/2307.00421
- **Abstract** Adversarial attack patches have gained increasing attention due to their practical applicability in physical-world scenarios. However, the bright colors used in attack patches represent a significant drawback, as they can be easily identified by human observers. Moreover, even though these attacks have been highly successful in deceiving target networks, which specific features of the attack patch contribute to its success are still unknown. Our paper introduces a brightness-restricted patch (BrPatch) that uses optical characteristics to effectively reduce conspicuousness while preserving image independence. We also conducted an analysis of the impact of various image features (such as color, texture, noise, and size) on the effectiveness of an attack patch in physical-world deployment. Our experiments show that attack patches exhibit strong redundancy to brightness and are resistant to color transfer and noise. Based on our findings, we propose some additional methods to further reduce the conspicuousness of BrPatch. Our findings also explain the robustness of attack patches observed in physical-world scenarios.

### Referring Video Object Segmentation with Inter-Frame Interaction and Cross-Modal Correlation
- **Authors:** Meng Lan, Fu Rong, Lefei Zhang
- **Subjects:** Computer Vision and Pattern Recognition (cs.CV)
- **Arxiv link:** https://arxiv.org/abs/2307.00536
- **Pdf link:** https://arxiv.org/pdf/2307.00536
- **Abstract** Referring video object segmentation (RVOS) aims to segment the target object in a video sequence described by a language expression. Typical query-based methods process the video sequence in a frame-independent manner to reduce the high computational cost, which however affects the performance due to the lack of inter-frame interaction for temporal coherence modeling and spatio-temporal representation learning of the referred object. Besides, they directly adopt the raw and high-level sentence feature as the language queries to decode the visual features, where the weak correlation between visual and linguistic features also increases the difficulty of decoding the target information and limits the performance of the model. In this paper, we propose a novel RVOS framework, dubbed IFIRVOS, to address these issues. Specifically, we design a plug-and-play inter-frame interaction module in the Transformer decoder to efficiently learn the spatio-temporal features of the referred object, so as to decode the object information in the video sequence more precisely and generate more accurate segmentation results. Moreover, we devise the vision-language interaction module before the multimodal Transformer to enhance the correlation between the visual and linguistic features, thus facilitating the process of decoding object information from visual features by language queries in the Transformer decoder and improving the segmentation performance. Extensive experimental results on three benchmarks validate the superiority of our IFIRVOS over state-of-the-art methods and the effectiveness of our proposed modules.

### Intra- & Extra-Source Exemplar-Based Style Synthesis for Improved Domain Generalization
- **Authors:** Yumeng Li, Dan Zhang, Margret Keuper, Anna Khoreva
- **Subjects:** Computer Vision and Pattern Recognition (cs.CV); Artificial Intelligence (cs.AI); Machine Learning (cs.LG)
- **Arxiv link:** https://arxiv.org/abs/2307.00648
- **Pdf link:** https://arxiv.org/pdf/2307.00648
- **Abstract** The generalization with respect to domain shifts, as they frequently appear in applications such as autonomous driving, is one of the remaining big challenges for deep learning models. Therefore, we propose an exemplar-based style synthesis pipeline to improve domain generalization in semantic segmentation. Our method is based on a novel masked noise encoder for StyleGAN2 inversion. The model learns to faithfully reconstruct the image, preserving its semantic layout through noise prediction. Using the proposed masked noise encoder to randomize style and content combinations in the training set, i.e., intra-source style augmentation (ISSA) effectively increases the diversity of training data and reduces spurious correlation. As a result, we achieve up to $12.4\%$ mIoU improvements on driving-scene semantic segmentation under different types of data shifts, i.e., changing geographic locations, adverse weather conditions, and day to night. ISSA is model-agnostic and straightforwardly applicable with CNNs and Transformers. It is also complementary to other domain generalization techniques, e.g., it improves the recent state-of-the-art solution RobustNet by $3\%$ mIoU in Cityscapes to Dark Zürich. In addition, we demonstrate the strong plug-n-play ability of the proposed style synthesis pipeline, which is readily usable for extra-source exemplars e.g., web-crawled images, without any retraining or fine-tuning. Moreover, we study a new use case to indicate neural network's generalization capability by building a stylized proxy validation set. This application has significant practical sense for selecting models to be deployed in the open-world environment. Our code is available at https://github.com/boschresearch/ISSA.

### ACDMSR: Accelerated Conditional Diffusion Models for Single Image Super-Resolution
- **Authors:** Axi Niu, Pham Xuan Trung, Kang Zhang, Jinqiu Sun, Yu Zhu, In So Kweon, Yanning Zhang
- **Subjects:** Computer Vision and Pattern Recognition (cs.CV); Image and Video Processing (eess.IV)
- **Arxiv link:** https://arxiv.org/abs/2307.00781
- **Pdf link:** https://arxiv.org/pdf/2307.00781
- **Abstract** Diffusion models have gained significant popularity in the field of image-to-image translation. Previous efforts applying diffusion models to image super-resolution (SR) have demonstrated that iteratively refining pure Gaussian noise using a U-Net architecture trained on denoising at various noise levels can yield satisfactory high-resolution images from low-resolution inputs. However, this iterative refinement process comes with the drawback of low inference speed, which strongly limits its applications. To speed up inference and further enhance the performance, our research revisits diffusion models in image super-resolution and proposes a straightforward yet significant diffusion model-based super-resolution method called ACDMSR (accelerated conditional diffusion model for image super-resolution). Specifically, our method adapts the standard diffusion model to perform super-resolution through a deterministic iterative denoising process. Our study also highlights the effectiveness of using a pre-trained SR model to provide the conditional image of the given low-resolution (LR) image to achieve superior high-resolution results. We demonstrate that our method surpasses previous attempts in qualitative and quantitative results through extensive experiments conducted on benchmark datasets such as Set5, Set14, Urban100, BSD100, and Manga109. Moreover, our approach generates more visually realistic counterparts for low-resolution images, emphasizing its effectiveness in practical scenarios.

### Visual Instruction Tuning with Polite Flamingo
- **Authors:** Delong Chen, Jianfeng Liu, Wenliang Dai, Baoyuan Wang
- **Subjects:** Computer Vision and Pattern Recognition (cs.CV); Computation and Language (cs.CL)
- **Arxiv link:** https://arxiv.org/abs/2307.01003
- **Pdf link:** https://arxiv.org/pdf/2307.01003
- **Abstract** Recent research has demonstrated that the multi-task fine-tuning of multi-modal Large Language Models (LLMs) using an assortment of annotated downstream vision-language datasets significantly enhances their performance. Yet, during this process, a side effect, which we termed as the "multi-modal alignment tax", surfaces. This side effect negatively impacts the model's ability to format responses appropriately -- for instance, its "politeness" -- due to the overly succinct and unformatted nature of raw annotations, resulting in reduced human preference. In this paper, we introduce Polite Flamingo, a multi-modal response rewriter that transforms raw annotations into a more appealing, "polite" format. Polite Flamingo is trained to reconstruct high-quality responses from their automatically distorted counterparts and is subsequently applied to a vast array of vision-language datasets for response rewriting. After rigorous filtering, we generate the PF-1M dataset and further validate its value by fine-tuning a multi-modal LLM with it. Combined with novel methodologies including U-shaped multi-stage tuning and multi-turn augmentation, the resulting model, Clever Flamingo, demonstrates its advantages in both multi-modal understanding and response politeness according to automated and human evaluations.

### SAM-DA: UAV Tracks Anything at Night with SAM-Powered Domain Adaptation
- **Authors:** Liangliang Yao, Haobo Zuo, Guangze Zheng, Changhong Fu, Jia Pan
- **Subjects:** Computer Vision and Pattern Recognition (cs.CV)
- **Arxiv link:** https://arxiv.org/abs/2307.01024
- **Pdf link:** https://arxiv.org/pdf/2307.01024
- **Abstract** Domain adaptation (DA) has demonstrated significant promise for real-time nighttime unmanned aerial vehicle (UAV) tracking. However, the state-of-the-art (SOTA) DA still lacks the potential object with accurate pixel-level location and boundary to generate the high-quality target domain training sample. This key issue constrains the transfer learning of the real-time daytime SOTA trackers for challenging nighttime UAV tracking. Recently, the notable Segment Anything Model (SAM) has achieved remarkable zero-shot generalization ability to discover abundant potential objects due to its huge data-driven training approach. To solve the aforementioned issue, this work proposes a novel SAM-powered DA framework for real-time nighttime UAV tracking, i.e., SAM-DA. Specifically, an innovative SAM-powered target domain training sample swelling is designed to determine enormous high-quality target domain training samples from every single raw nighttime image. This novel one-to-many method significantly expands the high-quality target domain training sample for DA. Comprehensive experiments on extensive nighttime UAV videos prove the robustness and domain adaptability of SAM-DA for nighttime UAV tracking. Especially, compared to the SOTA DA, SAM-DA can achieve better performance with fewer raw nighttime images, i.e., the fewer-better training. This economized training approach facilitates the quick validation and deployment of algorithms for UAVs. The code is available at https://github.com/vision4robotics/SAM-DA.

## Keyword: raw image

There is no result
spatial resolution multispectral image to create a high resolution multispectral image a novel single branch single scale lightweight convolutional neural network named sdrcnn is developed in this study by using a novel dense residual connected structure and convolution block sdrcnn achieved a better trade off between accuracy and efficiency the performance of sdrcnn was tested using four datasets from the worldview worldview and quickbird satellites the compared methods include eight traditional methods i e gs gsa pracs bdsd sfim glp cbd cdif and lrtcfpan and five lightweight deep learning methods i e pnn pannet bayesiannet dmdnet and fusionnet based on a visual inspection of the pansharpened images created and the associated absolute residual maps sdrcnn exhibited least spatial detail blurring and spectral distortion amongst all the methods considered the values of the quantitative evaluation metrics were closest to their ideal values when sdrcnn was used the processing time of sdrcnn was also the shortest among all methods tested finally the effectiveness of each component in the sdrcnn was demonstrated in ablation experiments all of these confirmed the superiority of sdrcnn keyword image signal processing there is no result keyword image signal process there is no result keyword compression a proximal algorithm for network slimming authors kevin bui fanghui xue fredrick park yingyong qi jack xin subjects computer vision and pattern recognition cs cv arxiv link pdf link abstract as a popular channel pruning method for convolutional neural networks cnns network slimming ns has a three stage process it trains a cnn with ell regularization applied to the scaling factors of the batch normalization layers it removes channels whose scaling factors are below a chosen threshold and it retrains the pruned model to recover the original accuracy this time consuming three step process is a result of using subgradient descent to train cnns because subgradient descent does 
not exactly train cnns towards sparse accurate structures the latter two steps are necessary moreover subgradient descent does not have any convergence guarantee therefore we develop an alternative algorithm called proximal ns our proposed algorithm trains cnns towards sparse accurate structures so identifying a scaling factor threshold is unnecessary and fine tuning the pruned cnns is optional using kurdyka l ojasiewicz assumptions we establish global convergence of proximal ns lastly we validate the efficacy of the proposed algorithm on vggnet densenet and resnet on cifar our experiments demonstrate that after one round of training proximal ns yields a cnn with competitive accuracy and compression structured network pruning by measuring filter wise interactions authors wenting tang xingxing wei bo li beijing key laboratory of digital media school of computer science and engineering beihang university beijing china subjects computer vision and pattern recognition cs cv arxiv link pdf link abstract structured network pruning is a practical approach to reduce computation cost directly while retaining the cnns generalization performance in real applications however identifying redundant filters is a core problem in structured network pruning and current redundancy criteria only focus on individual filters attributes when pruning sparsity increases these redundancy criteria are not effective or efficient enough since the filter wise interaction also contributes to the cnn s prediction accuracy we integrate the filter wise interaction into the redundancy criterion in our criterion we introduce the filter importance and filter utilization strength to reflect the decision ability of individual and multiple filters utilizing this new redundancy criterion we propose a structured network pruning approach snpfi structured network pruning by measuring filter wise interaction during the pruning the snpfi can automatically assign the proper sparsity based on the filter 
utilization strength and eliminate the useless filters by filter importance after the pruning the snpfi can recover pruned model s performance effectively without iterative training by minimizing the interaction difference we empirically demonstrate the effectiveness of the snpfi with several commonly used cnn models including alexnet and resnet on various image classification datasets including mnist cifar and imagenet for all experimental cnn models nearly of computation is reduced in a network compression while the classification accuracy remains neubtf neural fields for btf encoding and transfer authors carlos rodriguez pardo konstantinos kazatzis jorge lopez moreno elena garces subjects computer vision and pattern recognition cs cv artificial intelligence cs ai graphics cs gr machine learning cs lg arxiv link pdf link abstract neural material representations are becoming a popular way to represent materials for rendering they are more expressive than analytic models and occupy less memory than tabulated btfs however existing neural materials are immutable meaning that their output for a certain query of uvs camera and light vector is fixed once they are trained while this is practical when there is no need to edit the material it can become very limiting when the fragment of the material used for training is too small or not tileable which frequently happens when the material has been captured with a gonioreflectometer in this paper we propose a novel neural material representation which jointly tackles the problems of btf compression tiling and extrapolation at test time our method uses a guidance image as input to condition the neural btf to the structural features of this input image then the neural btf can be queried as a regular btf using uvs camera and light vectors every component in our framework is purposefully designed to maximize btf encoding quality at minimal parameter count and computational complexity achieving competitive compression rates 
compared with previous work we demonstrate the results of our method on a variety of synthetic and captured materials showing its generality and capacity to learn to represent many optical properties keyword raw situated cameras situated knowledges towards an egocentric epistemology for computer vision authors samuel goree david crandall subjects computer vision and pattern recognition cs cv arxiv link pdf link abstract in her influential paper situated knowledges donna haraway uses vision and perspective as a metaphor to discuss scientific knowledge today egocentric computer vision discusses many of the same issues except in a literal vision context in this short position paper we collapse that metaphor and explore the interactions between feminist epistemology and egocentric cv as egocentric epistemology using this framework we argue for the use of qualitative human centric methods as a complement to performance benchmarks to center both the literal and metaphorical perspective of human crowd workers in cv brightness restricted adversarial attack patch authors mingzhen shao subjects computer vision and pattern recognition cs cv arxiv link pdf link abstract adversarial attack patches have gained increasing attention due to their practical applicability in physical world scenarios however the bright colors used in attack patches represent a significant drawback as they can be easily identified by human observers moreover even though these attacks have been highly successful in deceiving target networks which specific features of the attack patch contribute to its success are still unknown our paper introduces a brightness restricted patch brpatch that uses optical characteristics to effectively reduce conspicuousness while preserving image independence we also conducted an analysis of the impact of various image features such as color texture noise and size on the effectiveness of an attack patch in physical world deployment our experiments show that attack patches 
exhibit strong redundancy to brightness and are resistant to color transfer and noise based on our findings we propose some additional methods to further reduce the conspicuousness of brpatch our findings also explain the robustness of attack patches observed in physical world scenarios referring video object segmentation with inter frame interaction and cross modal correlation authors meng lan fu rong lefei zhang subjects computer vision and pattern recognition cs cv arxiv link pdf link abstract referring video object segmentation rvos aims to segment the target object in a video sequence described by a language expression typical query based methods process the video sequence in a frame independent manner to reduce the high computational cost which however affects the performance due to the lack of inter frame interaction for temporal coherence modeling and spatio temporal representation learning of the referred object besides they directly adopt the raw and high level sentence feature as the language queries to decode the visual features where the weak correlation between visual and linguistic features also increases the difficulty of decoding the target information and limits the performance of the model in this paper we proposes a novel rvos framework dubbed ifirvos to address these issues specifically we design a plug and play inter frame interaction module in the transformer decoder to efficiently learn the spatio temporal features of the referred object so as to decode the object information in the video sequence more precisely and generate more accurate segmentation results moreover we devise the vision language interaction module before the multimodal transformer to enhance the correlation between the visual and linguistic features thus facilitating the process of decoding object information from visual features by language queries in transformer decoder and improving the segmentation performance extensive experimental results on three benchmarks validate 
the superiority of our ifirvos over state of the art methods and the effectiveness of our proposed modules intra extra source exemplar based style synthesis for improved domain generalization authors yumeng li dan zhang margret keuper anna khoreva subjects computer vision and pattern recognition cs cv artificial intelligence cs ai machine learning cs lg arxiv link pdf link abstract the generalization with respect to domain shifts as they frequently appear in applications such as autonomous driving is one of the remaining big challenges for deep learning models therefore we propose an exemplar based style synthesis pipeline to improve domain generalization in semantic segmentation our method is based on a novel masked noise encoder for inversion the model learns to faithfully reconstruct the image preserving its semantic layout through noise prediction using the proposed masked noise encoder to randomize style and content combinations in the training set i e intra source style augmentation issa effectively increases the diversity of training data and reduces spurious correlation as a result we achieve up to miou improvements on driving scene semantic segmentation under different types of data shifts i e changing geographic locations adverse weather conditions and day to night issa is model agnostic and straightforwardly applicable with cnns and transformers it is also complementary to other domain generalization techniques e g it improves the recent state of the art solution robustnet by miou in cityscapes to dark z urich in addition we demonstrate the strong plug n play ability of the proposed style synthesis pipeline which is readily usable for extra source exemplars e g web crawled images without any retraining or fine tuning moreover we study a new use case to indicate neural network s generalization capability by building a stylized proxy validation set this application has significant practical sense for selecting models to be deployed in the open world 
environment our code is available at url acdmsr accelerated conditional diffusion models for single image super resolution authors axi niu pham xuan trung kang zhang jinqiu sun yu zhu in so kweon yanning zhang subjects computer vision and pattern recognition cs cv image and video processing eess iv arxiv link pdf link abstract diffusion models have gained significant popularity in the field of image to image translation previous efforts applying diffusion models to image super resolution sr have demonstrated that iteratively refining pure gaussian noise using a u net architecture trained on denoising at various noise levels can yield satisfactory high resolution images from low resolution inputs however this iterative refinement process comes with the drawback of low inference speed which strongly limits its applications to speed up inference and further enhance the performance our research revisits diffusion models in image super resolution and proposes a straightforward yet significant diffusion model based super resolution method called acdmsr accelerated conditional diffusion model for image super resolution specifically our method adapts the standard diffusion model to perform super resolution through a deterministic iterative denoising process our study also highlights the effectiveness of using a pre trained sr model to provide the conditional image of the given low resolution lr image to achieve superior high resolution results we demonstrate that our method surpasses previous attempts in qualitative and quantitative results through extensive experiments conducted on benchmark datasets such as and moreover our approach generates more visually realistic counterparts for low resolution images emphasizing its effectiveness in practical scenarios visual instruction tuning with polite flamingo authors delong chen jianfeng liu wenliang dai baoyuan wang subjects computer vision and pattern recognition cs cv computation and language cs cl arxiv link pdf link 
abstract recent research has demonstrated that the multi task fine tuning of multi modal large language models llms using an assortment of annotated downstream vision language datasets significantly enhances their performance yet during this process a side effect which we termed as the multi modal alignment tax surfaces this side effect negatively impacts the model s ability to format responses appropriately for instance its politeness due to the overly succinct and unformatted nature of raw annotations resulting in reduced human preference in this paper we introduce polite flamingo a multi modal response rewriter that transforms raw annotations into a more appealing polite format polite flamingo is trained to reconstruct high quality responses from their automatically distorted counterparts and is subsequently applied to a vast array of vision language datasets for response rewriting after rigorous filtering we generate the pf dataset and further validate its value by fine tuning a multi modal llm with it combined with novel methodologies including u shaped multi stage tuning and multi turn augmentation the resulting model clever flamingo demonstrates its advantages in both multi modal understanding and response politeness according to automated and human evaluations sam da uav tracks anything at night with sam powered domain adaptation authors liangliang yao haobo zuo guangze zheng changhong fu jia pan subjects computer vision and pattern recognition cs cv arxiv link pdf link abstract domain adaptation da has demonstrated significant promise for real time nighttime unmanned aerial vehicle uav tracking however the state of the art sota da still lacks the potential object with accurate pixel level location and boundary to generate the high quality target domain training sample this key issue constrains the transfer learning of the real time daytime sota trackers for challenging nighttime uav tracking recently the notable segment anything model sam has achieved 
remarkable zero shot generalization ability to discover abundant potential objects due to its huge data driven training approach to solve the aforementioned issue this work proposes a novel sam powered da framework for real time nighttime uav tracking i e sam da specifically an innovative sam powered target domain training sample swelling is designed to determine enormous high quality target domain training samples from every single raw nighttime image this novel one to many method significantly expands the high quality target domain training sample for da comprehensive experiments on extensive nighttime uav videos prove the robustness and domain adaptability of sam da for nighttime uav tracking especially compared to the sota da sam da can achieve better performance with fewer raw nighttime images i e the fewer better training this economized training approach facilitates the quick validation and deployment of algorithms for uavs the code is available at keyword raw image there is no result
1
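The MC-MIL record above mentions training a regression network with Sultani's MIL ranking function over bags of video segments. As a rough illustration of that ranking term only (not the record's actual implementation — the bag scores and margin below are made-up assumptions), the core hinge comparison between an anomalous bag and a normal bag can be sketched in plain Python:

```python
# Hedged sketch of a MIL ranking hinge loss in the spirit of the
# record above: compare the highest-scoring segment of an anomalous
# bag against the highest-scoring segment of a normal bag.
# All numbers here are illustrative assumptions, not model output.

def mil_ranking_loss(anomaly_bag, normal_bag, margin=1.0):
    """Hinge loss on the max instance scores of the two bags."""
    return max(0.0, margin - max(anomaly_bag) + max(normal_bag))

# The anomalous bag's top segment (0.9) beats the normal bag's top
# segment (0.3) by more than the 0.5 margin, so the loss vanishes.
loss = mil_ranking_loss([0.1, 0.9, 0.4], [0.2, 0.3, 0.1], margin=0.5)
```

In practice this term is computed over a regression network's segment scores with extra smoothness and sparsity penalties; the sketch keeps only the ranking comparison.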
5,034
7,852,584,795
IssuesEvent
2018-06-20 14:58:04
AlexsLemonade/refinebio
https://api.github.com/repos/AlexsLemonade/refinebio
closed
We need to utilize our `is_processed` fields
api processor review
### Context We currently have `is_processed` fields on the samples and original_files tables. However we don't ever actually set these fields to True. ### Problem or idea Because we aren't setting these fields ever we also aren't filtering our API's responses on that field. This means that the front end is showing results including all of the data we've surveyed, not just the data we've processed. There will be data that we surveyed and decided not to process along with data that we do want to process but haven't done so yet. Those kinds of data should not be exposed to users via the front end (although our API should support querying for them). ### Solution or next step Modify all of the processors to set `is_processed` to True for every sample they process. Make sure the API Sample endpoints support filtering on this field. Additionally, and this could be tricky, we need to find a way to enable filtering the Experiment endpoints to only return Experiments which have had all of their Samples processed. ### New Issue Checklist - [x] The title is short and descriptive. - [x] You have explained the context that led you to write this issue. - [x] You have reported a problem or idea. - [x] You have proposed a solution or next step.
1.0
We need to utilize our `is_processed` fields - ### Context We currently have `is_processed` fields on the samples and original_files tables. However we don't ever actually set these fields to True. ### Problem or idea Because we aren't setting these fields ever we also aren't filtering our API's responses on that field. This means that the front end is showing results including all of the data we've surveyed, not just the data we've processed. There will be data that we surveyed and decided not to process along with data that we do want to process but haven't done so yet. Those kinds of data should not be exposed to users via the front end (although our API should support querying for them). ### Solution or next step Modify all of the processors to set `is_processed` to True for every sample they process. Make sure the API Sample endpoints support filtering on this field. Additionally, and this could be tricky, we need to find a way to enable filtering the Experiment endpoints to only return Experiments which have had all of their Samples processed. ### New Issue Checklist - [x] The title is short and descriptive. - [x] You have explained the context that led you to write this issue. - [x] You have reported a problem or idea. - [x] You have proposed a solution or next step.
process
we need to utilize our is processed fields context we currently have is processed fields on the samples and original files tables however we don t ever actually set these fields to true problem or idea because we aren t setting these fields ever we also aren t filtering our api s responses on that field this means that the front end is showing results including all of the data we ve surveyed not just the data we ve processed there will be data that we surveyed and decided not to process along with data that we do want to process but haven t done so yet those kinds of data should not be exposed to users via the front end although our api should support querying for them solution or next step modify all of the processors to set is processed to true for every sample they process make sure the api sample endpoints support filtering on this field additionally and this could be tricky we need to find a way to enable filtering the experiment endpoints to only return experiments which have had all of their samples processed new issue checklist the title is short and descriptive you have explained the context that led you to write this issue you have reported a problem or idea you have proposed a solution or next step
1
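The refinebio record above asks for Sample endpoints filterable on `is_processed` and for Experiment queries returning only experiments whose samples have all been processed. A minimal plain-Python sketch of that filtering logic — the dicts and field names are illustrative stand-ins, not refinebio's actual models or ORM calls:

```python
# Illustrative stand-ins for samples tied to experiments; the real
# project stores these in database tables behind an API, not dicts.
samples = [
    {"id": 1, "experiment": "E-1", "is_processed": True},
    {"id": 2, "experiment": "E-1", "is_processed": True},
    {"id": 3, "experiment": "E-2", "is_processed": False},
]

def processed_samples(samples):
    """Mirror of the requested API filter: keep processed samples."""
    return [s for s in samples if s["is_processed"]]

def fully_processed_experiments(samples):
    """Experiments in which every associated sample is processed."""
    experiments = {s["experiment"] for s in samples}
    return sorted(
        e for e in experiments
        if all(s["is_processed"] for s in samples if s["experiment"] == e)
    )
```

The "tricky" part the issue flags is the second function: it is an all-of aggregation across a relation, which in a database-backed API needs a grouped query rather than a simple field filter.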
61,042
14,599,420,770
IssuesEvent
2020-12-21 04:08:28
doamatto/phone-passcode-gen
https://api.github.com/repos/doamatto/phone-passcode-gen
closed
CVE-2019-6283 (Medium) detected in opennmsopennms-source-26.0.0-1, node-sass-4.14.1.tgz
security vulnerability
## CVE-2019-6283 - Medium Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Libraries - <b>opennmsopennms-source-26.0.0-1</b>, <b>node-sass-4.14.1.tgz</b></p></summary> <p> <details><summary><b>node-sass-4.14.1.tgz</b></p></summary> <p>Wrapper around libsass</p> <p>Library home page: <a href="https://registry.npmjs.org/node-sass/-/node-sass-4.14.1.tgz">https://registry.npmjs.org/node-sass/-/node-sass-4.14.1.tgz</a></p> <p>Path to dependency file: phone-passcode-gen/package.json</p> <p>Path to vulnerable library: phone-passcode-gen/node_modules/gulp-sass/node_modules/node-sass/package.json</p> <p> Dependency Hierarchy: - gulp-sass-4.1.0.tgz (Root Library) - :x: **node-sass-4.14.1.tgz** (Vulnerable Library) </details> <p>Found in HEAD commit: <a href="https://github.com/doamatto/phone-passcode-gen/commit/9ddf2695e14fb4e1ed3b0dcbb49693b394383c4e">9ddf2695e14fb4e1ed3b0dcbb49693b394383c4e</a></p> <p>Found in base branch: <b>master</b></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary> <p> In LibSass 3.5.5, a heap-based buffer over-read exists in Sass::Prelexer::parenthese_scope in prelexer.hpp. 
<p>Publish Date: 2019-01-14 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2019-6283>CVE-2019-6283</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>6.5</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: None - User Interaction: Required - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: None - Integrity Impact: None - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2019-6284">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2019-6284</a></p> <p>Release Date: 2019-08-06</p> <p>Fix Resolution: LibSass - 3.6.0</p> </p> </details> <p></p> *** Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
True
CVE-2019-6283 (Medium) detected in opennmsopennms-source-26.0.0-1, node-sass-4.14.1.tgz - ## CVE-2019-6283 - Medium Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Libraries - <b>opennmsopennms-source-26.0.0-1</b>, <b>node-sass-4.14.1.tgz</b></p></summary> <p> <details><summary><b>node-sass-4.14.1.tgz</b></p></summary> <p>Wrapper around libsass</p> <p>Library home page: <a href="https://registry.npmjs.org/node-sass/-/node-sass-4.14.1.tgz">https://registry.npmjs.org/node-sass/-/node-sass-4.14.1.tgz</a></p> <p>Path to dependency file: phone-passcode-gen/package.json</p> <p>Path to vulnerable library: phone-passcode-gen/node_modules/gulp-sass/node_modules/node-sass/package.json</p> <p> Dependency Hierarchy: - gulp-sass-4.1.0.tgz (Root Library) - :x: **node-sass-4.14.1.tgz** (Vulnerable Library) </details> <p>Found in HEAD commit: <a href="https://github.com/doamatto/phone-passcode-gen/commit/9ddf2695e14fb4e1ed3b0dcbb49693b394383c4e">9ddf2695e14fb4e1ed3b0dcbb49693b394383c4e</a></p> <p>Found in base branch: <b>master</b></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary> <p> In LibSass 3.5.5, a heap-based buffer over-read exists in Sass::Prelexer::parenthese_scope in prelexer.hpp. 
<p>Publish Date: 2019-01-14 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2019-6283>CVE-2019-6283</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>6.5</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: None - User Interaction: Required - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: None - Integrity Impact: None - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2019-6284">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2019-6284</a></p> <p>Release Date: 2019-08-06</p> <p>Fix Resolution: LibSass - 3.6.0</p> </p> </details> <p></p> *** Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
non_process
cve medium detected in opennmsopennms source node sass tgz cve medium severity vulnerability vulnerable libraries opennmsopennms source node sass tgz node sass tgz wrapper around libsass library home page a href path to dependency file phone passcode gen package json path to vulnerable library phone passcode gen node modules gulp sass node modules node sass package json dependency hierarchy gulp sass tgz root library x node sass tgz vulnerable library found in head commit a href found in base branch master vulnerability details in libsass a heap based buffer over read exists in sass prelexer parenthese scope in prelexer hpp publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction required scope unchanged impact metrics confidentiality impact none integrity impact none availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution libsass step up your open source security game with whitesource
0
230,267
18,527,822,394
IssuesEvent
2021-10-20 23:25:09
rclone/rclone
https://api.github.com/repos/rclone/rclone
closed
Warn about un-normalised UTF-8, or normalise it
enhancement needs retest encoding
Denormalised UTF-8 uploaded from OS X by other tools than rclone is causing problems for OS X users. Options are * stop denormalising the UTF-8 * warn about denormalised UTF-8 * normalise the UTF-8 in the sync routines Follow up from #1431
1.0
Warn about un-normalised UTF-8, or normalise it - Denormalised UTF-8 uploaded from OS X by other tools than rclone is causing problems for OS X users. Options are * stop denormalising the UTF-8 * warn about denormalised UTF-8 * normalise the UTF-8 in the sync routines Follow up from #1431
non_process
warn about un normalised utf or normalise it denormalised utf uploaded from os x by other tools than rclone is causing problems for os x users options are stop denormalising the utf warn about denormalised utf normalise the utf in the sync routines follow up from
0
829,857
31,926,631,129
IssuesEvent
2023-09-19 02:38:06
vmware/singleton
https://api.github.com/repos/vmware/singleton
closed
[BUG] [Go Service]400 Bad Request when request non exist region with svg type
priority/medium
**Describe the bug** [BUG] [Go Service]400 Bad Request when request non exist region with svg type **To Reproduce** Steps to reproduce the behavior: 1. Go to '[countryflag-api](http://10.186.134.67:9091/i18n/api/v2/swagger/index.html#/countryflag-api)' 2. Input region=ko scale=1 type=svg 3. See error <img width="1085" alt="image" src="https://github.com/vmware/singleton/assets/82644009/387d8b2b-3ec6-44c5-a125-6c28aea20209"> **Expected behavior** { "response": { "code": 400, "message": "Current request region 'ko' is not support.", "serverTime": "" }, "signature": "", "data": null } bfa3e1d5d36b9f98a4382d3b3ecd60f4d81209d3
1.0
[BUG] [Go Service]400 Bad Request when request non exist region with svg type - **Describe the bug** [BUG] [Go Service]400 Bad Request when request non exist region with svg type **To Reproduce** Steps to reproduce the behavior: 1. Go to '[countryflag-api](http://10.186.134.67:9091/i18n/api/v2/swagger/index.html#/countryflag-api)' 2. Input region=ko scale=1 type=svg 3. See error <img width="1085" alt="image" src="https://github.com/vmware/singleton/assets/82644009/387d8b2b-3ec6-44c5-a125-6c28aea20209"> **Expected behavior** { "response": { "code": 400, "message": "Current request region 'ko' is not support.", "serverTime": "" }, "signature": "", "data": null } bfa3e1d5d36b9f98a4382d3b3ecd60f4d81209d3
non_process
bad request when request non exist region with svg type describe the bug bad request when request non exist region with svg type to reproduce steps to reproduce the behavior go to input region ko scale type svg see error img width alt image src expected behavior response code message current request region ko is not support servertime signature data null
0
75,104
25,534,090,559
IssuesEvent
2022-11-29 10:37:05
vector-im/element-web
https://api.github.com/repos/vector-im/element-web
closed
Clicking a permalink doesn't scroll it into view
T-Defect X-Regression S-Minor X-Release-Blocker O-Frequent Z-MadLittleMods
### Steps to reproduce 1. Send a message 2. Send 30 other messages 3. Send a reply to the first message 4. Click on the reply ### Outcome #### What did you expect? The timeline to scroll. #### What happened instead? Lack of scrolling, though manually scrolling up reveals the event highlighted. The same bug applies to the "jump to bottom" button. ### Operating system _No response_ ### Application version _No response_ ### How did you install the app? _No response_ ### Homeserver _No response_ ### Will you send logs? No
1.0
Clicking a permalink doesn't scroll it into view - ### Steps to reproduce 1. Send a message 2. Send 30 other messages 3. Send a reply to the first message 4. Click on the reply ### Outcome #### What did you expect? The timeline to scroll. #### What happened instead? Lack of scrolling, though manually scrolling up reveals the event highlighted. The same bug applies to the "jump to bottom" button. ### Operating system _No response_ ### Application version _No response_ ### How did you install the app? _No response_ ### Homeserver _No response_ ### Will you send logs? No
non_process
clicking a permalink doesn t scroll it into view steps to reproduce send a message send other messages send a reply to the first message click on the reply outcome what did you expect the timeline to scroll what happened instead lack of scrolling though manually scrolling up reveals the event highlighted the same bug applies to the jump to bottom button operating system no response application version no response how did you install the app no response homeserver no response will you send logs no
0
11,349
14,170,584,806
IssuesEvent
2020-11-12 14:43:06
pystatgen/sgkit
https://api.github.com/repos/pystatgen/sgkit
opened
Coverage is less than 100%
bug process + tools
The check that coverage must be 100% stopped working at some point. I'm not sure if it's related to the numba jit coverage issue (see #77), but I think we should mark numba functions as `no cover` (since there doesn't seem to be a way of getting coverage for these functions, see https://github.com/nedbat/coveragepy/issues/849). We also need to reinstate the mechanism that fails the build if coverage is under 100%, and fix the places not being covered. I'll submit a patch to do all this.
1.0
Coverage is less than 100% - The check that coverage must be 100% stopped working at some point. I'm not sure if it's related to the numba jit coverage issue (see #77), but I think we should mark numba functions as `no cover` (since there doesn't seem to be a way of getting coverage for these functions, see https://github.com/nedbat/coveragepy/issues/849). We also need to reinstate the mechanism that fails the build if coverage is under 100%, and fix the places not being covered. I'll submit a patch to do all this.
process
coverage is less than the check that coverage must be stopped working at some point i m not sure if it s related to the numba jit coverage issue see but i think we should mark numba functions as no cover since there doesn t seem to be a way of getting coverage for these functions see we also need to reinstate the mechanism that fails the build if coverage is under and fix the places not being covered i ll submit a patch to do all this
1
364,573
10,765,984,966
IssuesEvent
2019-11-01 12:37:59
kiwicom/schemathesis
https://api.github.com/repos/kiwicom/schemathesis
closed
Broken compatibility with Hypothesis 4.42.4
Priority: Critical Type: Bug
There was a bug in hypothesis, which is fixed in the recent version - https://github.com/HypothesisWorks/hypothesis/commit/6d5527716c11a776b57b732ea286c1f18bd0eea0
1.0
Broken compatibility with Hypothesis 4.42.4 - There was a bug in hypothesis, which is fixed in the recent version - https://github.com/HypothesisWorks/hypothesis/commit/6d5527716c11a776b57b732ea286c1f18bd0eea0
non_process
broken compatibility with hypothesis there was a bug in hypothesis which is fixed in the recent version
0
17,893
23,868,471,847
IssuesEvent
2022-09-07 13:06:13
spyrales/shinygouv
https://api.github.com/repos/spyrales/shinygouv
closed
[explo-processus] Describe what the update method would be with {bslib}
explo-processus
As a future maintainer and user, I want to know what integrating a cosmetic update of the DSE into an existing Shiny app involves if the method is {bslib}, in order to choose the appropriate method ## Acceptance criteria A report / vignette with: - [x] A presentation of two existing Shiny components, one simple and one more complex (cf. the table in [Exploration of the update process](https://github.com/spyrales/shinygouv/milestone/4)) - [x] Choose the same ones as https://github.com/spyrales/shinygouv/issues/9 if possible - [ ] As a maintainer, I know how to modify the CSS provided by the DSE so that it fits my existing Shiny components with option A: https://spyrales.github.io/shinygouv/articles/recommandation-pour-l-implementation-de-dsfr.html#option-a-shiny-bslib - [ ] As a maintainer, I know how complex it would be to update this CSS automatically in the {shinygouv} package with R code (not necessarily to be developed right away) for the day the source file is updated on the DSE side, with option A ![image](https://user-images.githubusercontent.com/21193866/180023761-10b171d5-e842-4d66-8f80-cb555dea7187.png) - [x] As a user, I know how complex it would be to migrate the template of a Shiny application already in production to {shinygouv} with option A (same screenshot) ![image](https://user-images.githubusercontent.com/21193866/180023796-43278939-5477-41b4-84c2-5783890c95f1.png) - [ ] As a maintainer, I know how to modify the CSS provided by the DSE so that it fits my existing Shiny components with option B: https://spyrales.github.io/shinygouv/articles/recommandation-pour-l-implementation-de-dsfr.html#option-b-shiny-et-replace_me - [ ] As a maintainer, I know how complex it would be to update this CSS automatically in the {shinygouv} package with R code (not necessarily to be developed right away) for the day the source file is updated on the DSE side, with option B
![image](https://user-images.githubusercontent.com/21193866/180023903-a4b3c542-e2b5-4d60-8d11-3aae7d4adedf.png) - [x] As a user, I know how complex it would be to migrate the template of a Shiny application already in production to {shinygouv} with option B (same screenshot) ![image](https://user-images.githubusercontent.com/21193866/180023908-98dc2c7e-7f8b-41d8-bd76-72fd2a548447.png) ## Technical note - [ ] Compare the stock Bootstrap CSS vs. the version modified for the DSE so that the components are as close as possible to the DSE - [ ] Describe how it would be used with {bslib} to apply it to an existing Shiny app - [ ] Spell out the steps required to migrate an existing Shiny application already in production - [ ] Spell out the steps required to update the package if the DSE changes (new fonts, new colors, new logos, ...) => This information will let the sponsors choose the method suited to their needs and their success criteria, namely - Avoid having specific functions so as not to hinder switching from one template to another, if possible - The package must be maintainable by its users and must allow quick updates if the css/js/fonts/images files change on the Design System de l'Etat side
1.0
[explo-processus] Describe what the update method would be with {bslib} - As a future maintainer and user, I want to know what integrating a cosmetic update of the DSE into an existing Shiny app involves if the method is {bslib}, in order to choose the appropriate method ## Acceptance criteria A report / vignette with: - [x] A presentation of two existing Shiny components, one simple and one more complex (cf. the table in [Exploration of the update process](https://github.com/spyrales/shinygouv/milestone/4)) - [x] Choose the same ones as https://github.com/spyrales/shinygouv/issues/9 if possible - [ ] As a maintainer, I know how to modify the CSS provided by the DSE so that it fits my existing Shiny components with option A: https://spyrales.github.io/shinygouv/articles/recommandation-pour-l-implementation-de-dsfr.html#option-a-shiny-bslib - [ ] As a maintainer, I know how complex it would be to update this CSS automatically in the {shinygouv} package with R code (not necessarily to be developed right away) for the day the source file is updated on the DSE side, with option A ![image](https://user-images.githubusercontent.com/21193866/180023761-10b171d5-e842-4d66-8f80-cb555dea7187.png) - [x] As a user, I know how complex it would be to migrate the template of a Shiny application already in production to {shinygouv} with option A (same screenshot) ![image](https://user-images.githubusercontent.com/21193866/180023796-43278939-5477-41b4-84c2-5783890c95f1.png) - [ ] As a maintainer, I know how to modify the CSS provided by the DSE so that it fits my existing Shiny components with option B: https://spyrales.github.io/shinygouv/articles/recommandation-pour-l-implementation-de-dsfr.html#option-b-shiny-et-replace_me - [ ] As a maintainer, I know how complex it would be to update this CSS automatically in the {shinygouv} package with R code (not necessarily to be developed right away) for the
day the source file is updated on the DSE side, with option B ![image](https://user-images.githubusercontent.com/21193866/180023903-a4b3c542-e2b5-4d60-8d11-3aae7d4adedf.png) - [x] As a user, I know how complex it would be to migrate the template of a Shiny application already in production to {shinygouv} with option B (same screenshot) ![image](https://user-images.githubusercontent.com/21193866/180023908-98dc2c7e-7f8b-41d8-bd76-72fd2a548447.png) ## Technical note - [ ] Compare the stock Bootstrap CSS vs. the version modified for the DSE so that the components are as close as possible to the DSE - [ ] Describe how it would be used with {bslib} to apply it to an existing Shiny app - [ ] Spell out the steps required to migrate an existing Shiny application already in production - [ ] Spell out the steps required to update the package if the DSE changes (new fonts, new colors, new logos, ...) => This information will let the sponsors choose the method suited to their needs and their success criteria, namely - Avoid having specific functions so as not to hinder switching from one template to another, if possible - The package must be maintainable by its users and must allow quick updates if the css/js/fonts/images files change on the Design System de l'Etat side
process
describe what the update method would be with bslib as a future maintainer and user i want to know what integrating a cosmetic update of the dse into an existing shiny app involves if the method is bslib in order to choose the appropriate method acceptance criteria a report vignette with a presentation of two existing shiny components one simple and one more complex cf the table in choose the same ones as if possible as a maintainer i know how to modify the css provided by the dse so that it fits my existing shiny components with option a as a maintainer i know how complex it would be to update this css automatically in the shinygouv package with r code not necessarily to be developed right away for the day the source file is updated on the dse side with option a as a user i know how complex it would be to migrate the template of a shiny application already in production to shinygouv with option a same screenshot as a maintainer i know how to modify the css provided by the dse so that it fits my existing shiny components with option b as a maintainer i know how complex it would be to update this css automatically in the shinygouv package with r code not necessarily to be developed right away for the day the source file is updated on the dse side with option b as a user i know how complex it would be to migrate the template of a shiny application already in production to shinygouv with option b same screenshot technical note compare the stock bootstrap css vs the version modified for the dse so that the components are as close as possible to the dse describe how it would be used with bslib to apply it to an existing shiny app spell out the steps required to migrate an existing shiny application already in production spell out the steps required to update the package if the dse changes new fonts new
colors new logos this information will let the sponsors choose the method suited to their needs and their success criteria namely avoid having specific functions so as not to hinder switching from one template to another if possible the package must be maintainable by its users and must allow quick updates if the css js fonts images files change on the design system de l etat side
1
11,085
13,928,351,594
IssuesEvent
2020-10-21 21:19:20
googleapis/google-auth-library-nodejs
https://api.github.com/repos/googleapis/google-auth-library-nodejs
opened
fix documentation related to authentication strategy
type: docs type: process
Related to #1084 and #983 How, and in what order of operations, are we authenticating in this library? We have a document describing how environment variables, constructors, etc., operate together, but what about order of operations within the constructor? For example, where do `keyfile`, `credentials`, and `clientOptions` fit in against the wider order of operations for authentication? When should we be trying to get a projectId?
1.0
fix documentation related to authentication strategy - Related to #1084 and #983 How, and in what order of operations, are we authenticating in this library? We have a document describing how environment variables, constructors, etc., operate together, but what about order of operations within the constructor? For example, where do `keyfile`, `credentials`, and `clientOptions` fit in against the wider order of operations for authentication? When should we be trying to get a projectId?
process
fix documentation related to authentication strategy related to and how and in what order of operations are we authenticating in this library we have a document describing how environment variables constructors etc operate together but what about order of operations within the constructor for example where do keyfile credentials and clientoptions fit in against the wider order of operations for authentication when should we be trying to get a projectid
1
4,849
4,681,962,963
IssuesEvent
2016-10-09 02:00:45
kubernetes/kubernetes
https://api.github.com/repos/kubernetes/kubernetes
reopened
Speed up list-watch in API Server side
area/performance component/controller-manager team/CSI-API Machinery SIG
The performance of list-watch takes an effect on the real-time event perceiving of kubernetes client side, such as scheduler, controller manager, kubelet etc. I have some ideas about speeding up list-watch in API Server side. 1. As @hongchaodeng mentioned [here](https://github.com/kubernetes/kubernetes/blob/master/pkg/storage/etcd3/watcher.go#L212), we can use etcd newest API(support prevKv in PUT/DELETE operations) to avoid extra Get() when transform etcd event. AFAIK, we can't introduce the API until cores/rkt upgrades its grpc package of its api service. However, good news is that rkt has upgraded it about a week ago. So, it's time to introduce etcd's newest feature? 2. Concurrently transform event to speed up list-watch server side processing event. We can spawn multiple goroutines to do the [procedure](https://github.com/kubernetes/kubernetes/blob/master/pkg/storage/etcd3/watcher.go#L189) concurrently. Of course, we should keep the order. What's your option? @hongchaodeng @xiang90 @wojtek-t
True
Speed up list-watch in API Server side - The performance of list-watch takes an effect on the real-time event perceiving of kubernetes client side, such as scheduler, controller manager, kubelet etc. I have some ideas about speeding up list-watch in API Server side. 1. As @hongchaodeng mentioned [here](https://github.com/kubernetes/kubernetes/blob/master/pkg/storage/etcd3/watcher.go#L212), we can use etcd newest API(support prevKv in PUT/DELETE operations) to avoid extra Get() when transform etcd event. AFAIK, we can't introduce the API until cores/rkt upgrades its grpc package of its api service. However, good news is that rkt has upgraded it about a week ago. So, it's time to introduce etcd's newest feature? 2. Concurrently transform event to speed up list-watch server side processing event. We can spawn multiple goroutines to do the [procedure](https://github.com/kubernetes/kubernetes/blob/master/pkg/storage/etcd3/watcher.go#L189) concurrently. Of course, we should keep the order. What's your option? @hongchaodeng @xiang90 @wojtek-t
non_process
speed up list watch in api server side the performance of list watch takes an effect on the real time event perceiving of kubernetes client side such as scheduler controller manager kubelet etc i have some ideas about speeding up list watch in api server side as hongchaodeng mentioned we can use etcd newest api support prevkv in put delete operations to avoid extra get when transform etcd event afaik we can t introduce the api until cores rkt upgrades its grpc package of its api service however good news is that rkt has upgraded it about a week ago so it s time to introduce etcd s newest feature concurrently transform event to speed up list watch server side processing event we can spawn multiple goroutines to do the concurrently of course we should keep the order what s your option hongchaodeng wojtek t
0
12,404
14,912,329,453
IssuesEvent
2021-01-22 12:30:05
alphagov/govuk-design-system
https://api.github.com/repos/alphagov/govuk-design-system
closed
Test issue
process 🕔 hours
## What Creating a test issue ## Why To demonstrate how to move a card to the sprint board ## Who needs to know about this Delivery Manager, Content Designer ## Further detail ## Done when - [x] Open issue - [x] Add assignees - [x] Add labels - [x] Add to sprint board - [x] Move to done - [x] Close
1.0
Test issue - ## What Creating a test issue ## Why To demonstrate how to move a card to the sprint board ## Who needs to know about this Delivery Manager, Content Designer ## Further detail ## Done when - [x] Open issue - [x] Add assignees - [x] Add labels - [x] Add to sprint board - [x] Move to done - [x] Close
process
test issue what creating a test issue why to demonstrate how to move a card to the sprint board who needs to know about this delivery manager content designer further detail done when open issue add assignees add labels add to sprint board move to done close
1
9,682
12,683,079,567
IssuesEvent
2020-06-19 18:52:50
nodejs/node
https://api.github.com/repos/nodejs/node
closed
Processes created using spawn with a pipe have no /dev/stdin
child_process
<!-- Thank you for reporting an issue. This issue tracker is for bugs and issues found within Node.js core. If you require more general support please file an issue on our help repo. https://github.com/nodejs/help Please fill in as much of the template below as you're able. Version: output of `node -v` Platform: output of `uname -a` (UNIX), or version and 32 or 64-bit (Windows) Subsystem: if known, please specify affected core module name If possible, please provide code that demonstrates the problem, keeping it as simple and free of external dependencies as you are able. --> * **Version**: v8.9.4 * **Platform**: Linux 4.9.0-6-amd64 #1 SMP Debian 4.9.82-1+deb9u3 (2018-03-02) x86_64 GNU/Linux * **Subsystem**: child_process <!-- Enter your issue details below this comment. --> The code below fails with `cat: /dev/stdin: No such device or address`. ```js const {spawn} = require('child_process'); const proc = spawn('cat', ['/dev/stdin'], {stdio: ['pipe', process.stdout, process.stderr]}); ``` Apparently Node spawns sub processes in a way that there is no `/dev/stdin` allocated. I think this is a NodeJS bug - on UNIX, a sub process should be able to open `/dev/stdin` if there is a stdin for it. The following Go program for example works perfectly fine: ```go func main() { cmd := exec.Command("cat", "/dev/stdin") cmd.Stderr = os.Stderr cmd.Stdout = os.Stdout stdin, _ := cmd.StdinPipe() go func() { stdin.Write([]byte("hello cat!\n")) stdin.Close() }() cmd.Run() } ```
1.0
Processes created using spawn with a pipe have no /dev/stdin - <!-- Thank you for reporting an issue. This issue tracker is for bugs and issues found within Node.js core. If you require more general support please file an issue on our help repo. https://github.com/nodejs/help Please fill in as much of the template below as you're able. Version: output of `node -v` Platform: output of `uname -a` (UNIX), or version and 32 or 64-bit (Windows) Subsystem: if known, please specify affected core module name If possible, please provide code that demonstrates the problem, keeping it as simple and free of external dependencies as you are able. --> * **Version**: v8.9.4 * **Platform**: Linux 4.9.0-6-amd64 #1 SMP Debian 4.9.82-1+deb9u3 (2018-03-02) x86_64 GNU/Linux * **Subsystem**: child_process <!-- Enter your issue details below this comment. --> The code below fails with `cat: /dev/stdin: No such device or address`. ```js const {spawn} = require('child_process'); const proc = spawn('cat', ['/dev/stdin'], {stdio: ['pipe', process.stdout, process.stderr]}); ``` Apparently Node spawns sub processes in a way that there is no `/dev/stdin` allocated. I think this is a NodeJS bug - on UNIX, a sub process should be able to open `/dev/stdin` if there is a stdin for it. The following Go program for example works perfectly fine: ```go func main() { cmd := exec.Command("cat", "/dev/stdin") cmd.Stderr = os.Stderr cmd.Stdout = os.Stdout stdin, _ := cmd.StdinPipe() go func() { stdin.Write([]byte("hello cat!\n")) stdin.Close() }() cmd.Run() } ```
process
processes created using spawn with a pipe have no dev stdin thank you for reporting an issue this issue tracker is for bugs and issues found within node js core if you require more general support please file an issue on our help repo please fill in as much of the template below as you re able version output of node v platform output of uname a unix or version and or bit windows subsystem if known please specify affected core module name if possible please provide code that demonstrates the problem keeping it as simple and free of external dependencies as you are able version platform linux smp debian gnu linux subsystem child process the code below fails with cat dev stdin no such device or address js const spawn require child process const proc spawn cat stdio apparently node spawns sub processes in a way that there is no dev stdin allocated i think this is a nodejs bug on unix a sub process should be able to open dev stdin if there is a stdin for it the following go program for example works perfectly fine go func main cmd exec command cat dev stdin cmd stderr os stderr cmd stdout os stdout stdin cmd stdinpipe go func stdin write byte hello cat n stdin close cmd run
1
18,499
24,551,136,697
IssuesEvent
2022-10-12 12:42:10
GoogleCloudPlatform/fda-mystudies
https://api.github.com/repos/GoogleCloudPlatform/fda-mystudies
closed
iOS > Create passcode is displayed twice for new user
Bug P1 iOS Process: Fixed Process: Tested dev
**Steps:** 1. Install the app 2. Signup 3. Enter verification code 4. Create a passcode and re-enter the passcode 5. Click on 'Next' and observe AR: Create passcode is displayed twice after the user signs up ER: Create passcode should be displayed only once
2.0
iOS > Create passcode is displayed twice for new user - **Steps:** 1. Install the app 2. Signup 3. Enter verification code 4. Create a passcode and re-enter the passcode 5. Click on 'Next' and observe AR: Create passcode is displayed twice after the user signs up ER: Create passcode should be displayed only once
process
ios create passcode is displayed twice for new user steps install the app signup enter verification code create a passcode and re enter the passcode click on next and observe ar create passcode is displayed twice after the user signs up er create passcode should be displayed only once
1
77,486
15,555,782,043
IssuesEvent
2021-03-16 06:45:32
jozseftiborcz/sast-eval-springboot1
https://api.github.com/repos/jozseftiborcz/sast-eval-springboot1
opened
CVE-2016-1000027 (High) detected in spring-web-5.2.2.RELEASE.jar
security vulnerability
## CVE-2016-1000027 - High Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>spring-web-5.2.2.RELEASE.jar</b></p></summary> <p>Spring Web</p> <p>Library home page: <a href="https://github.com/spring-projects/spring-framework">https://github.com/spring-projects/spring-framework</a></p> <p>Path to dependency file: sast-eval-springboot1/pom.xml</p> <p>Path to vulnerable library: /home/wss-scanner/.m2/repository/org/springframework/spring-web/5.2.2.RELEASE/spring-web-5.2.2.RELEASE.jar,/home/wss-scanner/.m2/repository/org/springframework/spring-web/5.2.2.RELEASE/spring-web-5.2.2.RELEASE.jar</p> <p> Dependency Hierarchy: - spring-boot-starter-web-2.2.2.RELEASE.jar (Root Library) - :x: **spring-web-5.2.2.RELEASE.jar** (Vulnerable Library) <p>Found in HEAD commit: <a href="https://github.com/jozseftiborcz/sast-eval-springboot1/commit/6be495c42af9c125397cbc9d4bbb4c6b9f7de18c">6be495c42af9c125397cbc9d4bbb4c6b9f7de18c</a></p> <p>Found in base branch: <b>master</b></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary> <p> Pivotal Spring Framework 4.1.4 suffers from a potential remote code execution (RCE) issue if used for Java deserialization of untrusted data. Depending on how the library is implemented within a product, this issue may or not occur, and authentication may be required. 
<p>Publish Date: 2020-01-02 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2016-1000027>CVE-2016-1000027</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>9.8</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: None - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: High - Integrity Impact: High - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://github.com/spring-projects/spring-framework/issues/25379">https://github.com/spring-projects/spring-framework/issues/25379</a></p> <p>Release Date: 2020-01-02</p> <p>Fix Resolution: org.springframework:spring-web:5.3.0</p> </p> </details> <p></p> *** Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
True
CVE-2016-1000027 (High) detected in spring-web-5.2.2.RELEASE.jar - ## CVE-2016-1000027 - High Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>spring-web-5.2.2.RELEASE.jar</b></p></summary> <p>Spring Web</p> <p>Library home page: <a href="https://github.com/spring-projects/spring-framework">https://github.com/spring-projects/spring-framework</a></p> <p>Path to dependency file: sast-eval-springboot1/pom.xml</p> <p>Path to vulnerable library: /home/wss-scanner/.m2/repository/org/springframework/spring-web/5.2.2.RELEASE/spring-web-5.2.2.RELEASE.jar,/home/wss-scanner/.m2/repository/org/springframework/spring-web/5.2.2.RELEASE/spring-web-5.2.2.RELEASE.jar</p> <p> Dependency Hierarchy: - spring-boot-starter-web-2.2.2.RELEASE.jar (Root Library) - :x: **spring-web-5.2.2.RELEASE.jar** (Vulnerable Library) <p>Found in HEAD commit: <a href="https://github.com/jozseftiborcz/sast-eval-springboot1/commit/6be495c42af9c125397cbc9d4bbb4c6b9f7de18c">6be495c42af9c125397cbc9d4bbb4c6b9f7de18c</a></p> <p>Found in base branch: <b>master</b></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary> <p> Pivotal Spring Framework 4.1.4 suffers from a potential remote code execution (RCE) issue if used for Java deserialization of untrusted data. Depending on how the library is implemented within a product, this issue may or not occur, and authentication may be required. <p>Publish Date: 2020-01-02 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2016-1000027>CVE-2016-1000027</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>9.8</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: None - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: High - Integrity Impact: High - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://github.com/spring-projects/spring-framework/issues/25379">https://github.com/spring-projects/spring-framework/issues/25379</a></p> <p>Release Date: 2020-01-02</p> <p>Fix Resolution: org.springframework:spring-web:5.3.0</p> </p> </details> <p></p> *** Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
non_process
cve high detected in spring web release jar cve high severity vulnerability vulnerable library spring web release jar spring web library home page a href path to dependency file sast eval pom xml path to vulnerable library home wss scanner repository org springframework spring web release spring web release jar home wss scanner repository org springframework spring web release spring web release jar dependency hierarchy spring boot starter web release jar root library x spring web release jar vulnerable library found in head commit a href found in base branch master vulnerability details pivotal spring framework suffers from a potential remote code execution rce issue if used for java deserialization of untrusted data depending on how the library is implemented within a product this issue may or not occur and authentication may be required publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction none scope unchanged impact metrics confidentiality impact high integrity impact high availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution org springframework spring web step up your open source security game with whitesource
0
105,164
9,037,368,060
IssuesEvent
2019-02-09 09:50:45
shaarli/Shaarli
https://api.github.com/repos/shaarli/Shaarli
closed
Open Graph meta tags not added on permalink
bug feedback needed please test template
I noticed something strange when trying to investigate this issue on Shaarli Material theme https://github.com/kalvn/Shaarli-Material/issues/88 With a fresh install, with default theme, when I open the permalink of a link, I cannot see og meta tags. ![image](https://user-images.githubusercontent.com/6149640/48058050-c7aaf480-e1b5-11e8-9e47-84f2c51594e9.png) But when I switch to material theme, on the exact same URL I can see them: ![image](https://user-images.githubusercontent.com/6149640/48057820-45bacb80-e1b5-11e8-831e-ba0e2b9360f8.png) What is weird is that I basically copied your implementation as-is, I cannot see any notable difference. And even weirded, if I check for example [this Shaarli](https://links.hoa.ro/?El5Epw) which uses default theme, og tags are present. Any idea?
1.0
Open Graph meta tags not added on permalink - I noticed something strange when trying to investigate this issue on Shaarli Material theme https://github.com/kalvn/Shaarli-Material/issues/88 With a fresh install, with default theme, when I open the permalink of a link, I cannot see og meta tags. ![image](https://user-images.githubusercontent.com/6149640/48058050-c7aaf480-e1b5-11e8-9e47-84f2c51594e9.png) But when I switch to material theme, on the exact same URL I can see them: ![image](https://user-images.githubusercontent.com/6149640/48057820-45bacb80-e1b5-11e8-831e-ba0e2b9360f8.png) What is weird is that I basically copied your implementation as-is, I cannot see any notable difference. And even weirded, if I check for example [this Shaarli](https://links.hoa.ro/?El5Epw) which uses default theme, og tags are present. Any idea?
non_process
open graph meta tags not added on permalink i noticed something strange when trying to investigate this issue on shaarli material theme with a fresh install with default theme when i open the permalink of a link i cannot see og meta tags but when i switch to material theme on the exact same url i can see them what is weird is that i basically copied your implementation as is i cannot see any notable difference and even weirded if i check for example which uses default theme og tags are present any idea
0
21,232
28,321,769,008
IssuesEvent
2023-04-11 02:10:46
vnphanquang/svelte-put
https://api.github.com/repos/vnphanquang/svelte-put
closed
[preprocess-inline-svg] support `class:name` syntax
priority:medium scope:preprocess-inline-svg
## Context Currently the conditional class shorthand [class:name](https://svelte.dev/docs#template-syntax-element-directives-class-name) is not parsed after transformation by `preprocess-inline-svg`. This issue tracks how (or whether it is possible) to resolve this issue. Potentially, this means syntax that includes `:` in themselves might suffer from the same issue (more testing needed)?
1.0
[preprocess-inline-svg] support `class:name` syntax - ## Context Currently the conditional class shorthand [class:name](https://svelte.dev/docs#template-syntax-element-directives-class-name) is not parsed after transformation by `preprocess-inline-svg`. This issue tracks how (or whether it is possible) to resolve this issue. Potentially, this means syntax that includes `:` in themselves might suffer from the same issue (more testing needed)?
process
support class name syntax context currently the conditional class shorthand is not parsed after transformation by preprocess inline svg this issue tracks how or whether it is possible to resolve this issue potentially this means syntax that includes in themselves might suffer from the same issue more testing needed
1
4,025
6,960,760,130
IssuesEvent
2017-12-08 05:52:46
bisq-network/exchange
https://api.github.com/repos/bisq-network/exchange
opened
Find a solution for more resiliance with price relays
a: feature Bounty task Trade process
We are very dependent on BitcoinAverage as the only Fiat price provider. If there are issues (as we just faced) all percentage based prices cannot be taken. We need secondary price sources and either fall back to that in case we have issues with BitcoinAverage or mix it. Thought there is some complexity as all price relays need to have the same source as otherwise take offer attempts can fail if the price what the maker and the taker sees is above a small tolerance window. E.g. If one peer gets the price from a relay which gets it from BitcoinAverage and the other peer is connected to a relay which is connected to coinmarketcap the price will be likely different and exceeds the tolerance window. So it need some creative solution how to deal with those issues.
1.0
Find a solution for more resiliance with price relays - We are very dependent on BitcoinAverage as the only Fiat price provider. If there are issues (as we just faced) all percentage based prices cannot be taken. We need secondary price sources and either fall back to that in case we have issues with BitcoinAverage or mix it. Thought there is some complexity as all price relays need to have the same source as otherwise take offer attempts can fail if the price what the maker and the taker sees is above a small tolerance window. E.g. If one peer gets the price from a relay which gets it from BitcoinAverage and the other peer is connected to a relay which is connected to coinmarketcap the price will be likely different and exceeds the tolerance window. So it need some creative solution how to deal with those issues.
process
find a solution for more resiliance with price relays we are very dependent on bitcoinaverage as the only fiat price provider if there are issues as we just faced all percentage based prices cannot be taken we need secondary price sources and either fall back to that in case we have issues with bitcoinaverage or mix it thought there is some complexity as all price relays need to have the same source as otherwise take offer attempts can fail if the price what the maker and the taker sees is above a small tolerance window e g if one peer gets the price from a relay which gets it from bitcoinaverage and the other peer is connected to a relay which is connected to coinmarketcap the price will be likely different and exceeds the tolerance window so it need some creative solution how to deal with those issues
1
14,515
17,611,818,176
IssuesEvent
2021-08-18 03:06:01
cypress-io/cypress
https://api.github.com/repos/cypress-io/cypress
closed
WebPack Compilation Error on macOS Big Sur 11.4 on newly generated project
npm: webpack-batteries-included-preprocessor
### Current behavior I created a new folder, ran `yarn add -D cypress`, then openend cypress via `./node_modules/.bin/cypress open` and clicked the `todo.spec.js` which was autogenerated. Cypress threw a WebPack Compile Error: <details> <summary>Stack</summary> <pre> Error: Webpack Compilation Error ./cypress/integration/1-getting-started/todo.spec.js Module build failed (from /Users/tom/Library/Caches/Cypress/8.2.0/Cypress.app/Contents/Resources/app/packages/server/node_modules/babel-loader/lib/index.js): TypeError: [BABEL]: Cannot convert undefined or null to object (While processing: /Users/tom/Library/Caches/Cypress/8.2.0/Cypress.app/Contents/Resources/app/packages/server/node_modules/@babel/plugin-proposal-class-properties/lib/index.js) at Watching.handle [as handler] (/Users/tom/Library/Caches/Cypress/8.2.0/Cypress.app/Contents/Resources/app/packages/server/node_modules/@cypress/webpack-preprocessor/dist/index.js:176:23) at /Users/tom/Library/Caches/Cypress/8.2.0/Cypress.app/Contents/Resources/app/packages/server/node_modules/webpack/lib/Watching.js:99:9 at AsyncSeriesHook.eval [as callAsync] (eval at create (/Users/tom/Library/Caches/Cypress/8.2.0/Cypress.app/Contents/Resources/app/packages/server/node_modules/tapable/lib/HookCodeFactory.js:33:10), <anonymous>:6:1) at AsyncSeriesHook.lazyCompileHook (/Users/tom/Library/Caches/Cypress/8.2.0/Cypress.app/Contents/Resources/app/packages/server/node_modules/tapable/lib/Hook.js:154:20) at Watching._done (/Users/tom/Library/Caches/Cypress/8.2.0/Cypress.app/Contents/Resources/app/packages/server/node_modules/webpack/lib/Watching.js:98:28) at /Users/tom/Library/Caches/Cypress/8.2.0/Cypress.app/Contents/Resources/app/packages/server/node_modules/webpack/lib/Watching.js:73:19 at Compiler.emitRecords (/Users/tom/Library/Caches/Cypress/8.2.0/Cypress.app/Contents/Resources/app/packages/server/node_modules/webpack/lib/Compiler.js:499:39) at /Users/tom/Library/Caches/Cypress/8.2.0/Cypress.app/Contents/Resources/app/packages/server/node_modules/webpack/lib/Watching.js:54:20 at /Users/tom/Library/Caches/Cypress/8.2.0/Cypress.app/Contents/Resources/app/packages/server/node_modules/webpack/lib/Compiler.js:485:14 at AsyncSeriesHook.eval [as callAsync] (eval at create (/Users/tom/Library/Caches/Cypress/8.2.0/Cypress.app/Contents/Resources/app/packages/server/node_modules/tapable/lib/HookCodeFactory.js:33:10), <anonymous>:6:1) at AsyncSeriesHook.lazyCompileHook (/Users/tom/Library/Caches/Cypress/8.2.0/Cypress.app/Contents/Resources/app/packages/server/node_modules/tapable/lib/Hook.js:154:20) at /Users/tom/Library/Caches/Cypress/8.2.0/Cypress.app/Contents/Resources/app/packages/server/node_modules/webpack/lib/Compiler.js:482:27 at /Users/tom/Library/Caches/Cypress/8.2.0/Cypress.app/Contents/Resources/app/packages/server/node_modules/neo-async/async.js:2818:7 at done (/Users/tom/Library/Caches/Cypress/8.2.0/Cypress.app/Contents/Resources/app/packages/server/node_modules/neo-async/async.js:3522:9) at AsyncSeriesHook.eval [as callAsync] (eval at create (/Users/tom/Library/Caches/Cypress/8.2.0/Cypress.app/Contents/Resources/app/packages/server/node_modules/tapable/lib/HookCodeFactory.js:33:10), <anonymous>:6:1) at AsyncSeriesHook.lazyCompileHook (/Users/tom/Library/Caches/Cypress/8.2.0/Cypress.app/Contents/Resources/app/packages/server/node_modules/tapable/lib/Hook.js:154:20) </pre> </details> This happened on all Cypress Versions until 4.x. I tried the internal Electron Browser and Chromium 94. ### Desired behavior It should run the example test without a compilation error ### Test code to reproduce Create a new folder, run `yarn add -D cypress` and `./node_modules/.bin/cypress open`. Click the `todo.spec.js` ### Cypress Version 8.2.0 ### Other Intel-Based macOS 11.4 (20F71) Node v16.4.2 (which shouldn't matter I guess) yarn as package manager I also removed all Cypress Caches and tried again. Same error.
1.0
WebPack Compilation Error on macOS Big Sur 11.4 on newly generated project - ### Current behavior I created a new folder, ran `yarn add -D cypress`, then openend cypress via `./node_modules/.bin/cypress open` and clicked the `todo.spec.js` which was autogenerated. Cypress threw a WebPack Compile Error: <details> <summary>Stack</summary> <pre> Error: Webpack Compilation Error ./cypress/integration/1-getting-started/todo.spec.js Module build failed (from /Users/tom/Library/Caches/Cypress/8.2.0/Cypress.app/Contents/Resources/app/packages/server/node_modules/babel-loader/lib/index.js): TypeError: [BABEL]: Cannot convert undefined or null to object (While processing: /Users/tom/Library/Caches/Cypress/8.2.0/Cypress.app/Contents/Resources/app/packages/server/node_modules/@babel/plugin-proposal-class-properties/lib/index.js) at Watching.handle [as handler] (/Users/tom/Library/Caches/Cypress/8.2.0/Cypress.app/Contents/Resources/app/packages/server/node_modules/@cypress/webpack-preprocessor/dist/index.js:176:23) at /Users/tom/Library/Caches/Cypress/8.2.0/Cypress.app/Contents/Resources/app/packages/server/node_modules/webpack/lib/Watching.js:99:9 at AsyncSeriesHook.eval [as callAsync] (eval at create (/Users/tom/Library/Caches/Cypress/8.2.0/Cypress.app/Contents/Resources/app/packages/server/node_modules/tapable/lib/HookCodeFactory.js:33:10), <anonymous>:6:1) at AsyncSeriesHook.lazyCompileHook (/Users/tom/Library/Caches/Cypress/8.2.0/Cypress.app/Contents/Resources/app/packages/server/node_modules/tapable/lib/Hook.js:154:20) at Watching._done (/Users/tom/Library/Caches/Cypress/8.2.0/Cypress.app/Contents/Resources/app/packages/server/node_modules/webpack/lib/Watching.js:98:28) at /Users/tom/Library/Caches/Cypress/8.2.0/Cypress.app/Contents/Resources/app/packages/server/node_modules/webpack/lib/Watching.js:73:19 at Compiler.emitRecords (/Users/tom/Library/Caches/Cypress/8.2.0/Cypress.app/Contents/Resources/app/packages/server/node_modules/webpack/lib/Compiler.js:499:39) at /Users/tom/Library/Caches/Cypress/8.2.0/Cypress.app/Contents/Resources/app/packages/server/node_modules/webpack/lib/Watching.js:54:20 at /Users/tom/Library/Caches/Cypress/8.2.0/Cypress.app/Contents/Resources/app/packages/server/node_modules/webpack/lib/Compiler.js:485:14 at AsyncSeriesHook.eval [as callAsync] (eval at create (/Users/tom/Library/Caches/Cypress/8.2.0/Cypress.app/Contents/Resources/app/packages/server/node_modules/tapable/lib/HookCodeFactory.js:33:10), <anonymous>:6:1) at AsyncSeriesHook.lazyCompileHook (/Users/tom/Library/Caches/Cypress/8.2.0/Cypress.app/Contents/Resources/app/packages/server/node_modules/tapable/lib/Hook.js:154:20) at /Users/tom/Library/Caches/Cypress/8.2.0/Cypress.app/Contents/Resources/app/packages/server/node_modules/webpack/lib/Compiler.js:482:27 at /Users/tom/Library/Caches/Cypress/8.2.0/Cypress.app/Contents/Resources/app/packages/server/node_modules/neo-async/async.js:2818:7 at done (/Users/tom/Library/Caches/Cypress/8.2.0/Cypress.app/Contents/Resources/app/packages/server/node_modules/neo-async/async.js:3522:9) at AsyncSeriesHook.eval [as callAsync] (eval at create (/Users/tom/Library/Caches/Cypress/8.2.0/Cypress.app/Contents/Resources/app/packages/server/node_modules/tapable/lib/HookCodeFactory.js:33:10), <anonymous>:6:1) at AsyncSeriesHook.lazyCompileHook (/Users/tom/Library/Caches/Cypress/8.2.0/Cypress.app/Contents/Resources/app/packages/server/node_modules/tapable/lib/Hook.js:154:20) </pre> </details> This happened on all Cypress Versions until 4.x. I tried the internal Electron Browser and Chromium 94. ### Desired behavior It should run the example test without a compilation error ### Test code to reproduce Create a new folder, run `yarn add -D cypress` and `./node_modules/.bin/cypress open`. Click the `todo.spec.js` ### Cypress Version 8.2.0 ### Other Intel-Based macOS 11.4 (20F71) Node v16.4.2 (which shouldn't matter I guess) yarn as package manager I also removed all Cypress Caches and tried again. Same error.
process
webpack compilation error on macos big sur on newly generated project current behavior i created a new folder ran yarn add d cypress then openend cypress via node modules bin cypress open and clicked the todo spec js which was autogenerated cypress threw a webpack compile error stack error webpack compilation error cypress integration getting started todo spec js module build failed from users tom library caches cypress cypress app contents resources app packages server node modules babel loader lib index js typeerror cannot convert undefined or null to object while processing users tom library caches cypress cypress app contents resources app packages server node modules babel plugin proposal class properties lib index js at watching handle users tom library caches cypress cypress app contents resources app packages server node modules cypress webpack preprocessor dist index js at users tom library caches cypress cypress app contents resources app packages server node modules webpack lib watching js at asyncserieshook eval eval at create users tom library caches cypress cypress app contents resources app packages server node modules tapable lib hookcodefactory js at asyncserieshook lazycompilehook users tom library caches cypress cypress app contents resources app packages server node modules tapable lib hook js at watching done users tom library caches cypress cypress app contents resources app packages server node modules webpack lib watching js at users tom library caches cypress cypress app contents resources app packages server node modules webpack lib watching js at compiler emitrecords users tom library caches cypress cypress app contents resources app packages server node modules webpack lib compiler js at users tom library caches cypress cypress app contents resources app packages server node modules webpack lib watching js at users tom library caches cypress cypress app contents resources app packages server node modules webpack lib compiler js at asyncserieshook eval eval at create users tom library caches cypress cypress app contents resources app packages server node modules tapable lib hookcodefactory js at asyncserieshook lazycompilehook users tom library caches cypress cypress app contents resources app packages server node modules tapable lib hook js at users tom library caches cypress cypress app contents resources app packages server node modules webpack lib compiler js at users tom library caches cypress cypress app contents resources app packages server node modules neo async async js at done users tom library caches cypress cypress app contents resources app packages server node modules neo async async js at asyncserieshook eval eval at create users tom library caches cypress cypress app contents resources app packages server node modules tapable lib hookcodefactory js at asyncserieshook lazycompilehook users tom library caches cypress cypress app contents resources app packages server node modules tapable lib hook js this happened on all cypress versions until x i tried the internal electron browser and chromium desired behavior it should run the example test without a compilation error test code to reproduce create a new folder run yarn add d cypress and node modules bin cypress open click the todo spec js cypress version other intel based macos node which shouldn t matter i guess yarn as package manager i also removed all cypress caches and tried again same error
1
8,819
11,936,834,153
IssuesEvent
2020-04-02 11:02:36
GoogleCloudPlatform/dotnet-docs-samples
https://api.github.com/repos/GoogleCloudPlatform/dotnet-docs-samples
opened
Dlp: TestTriggers test is failing on CI
api: dlp priority: p1 type: process
Test is failing with this error: > Error: Grpc.Core.RpcException : Status(StatusCode=InvalidArgument, Detail="Invalid built-in info type name "US_ZIP".") I haven't been able to reproduce it on local. Skipping the test for now. @ace-n assigning to you since you were the last to significantly modify this test. Let me know if I can help.
1.0
Dlp: TestTriggers test is failing on CI - Test is failing with this error: > Error: Grpc.Core.RpcException : Status(StatusCode=InvalidArgument, Detail="Invalid built-in info type name "US_ZIP".") I haven't been able to reproduce it on local. Skipping the test for now. @ace-n assigning to you since you were the last to significantly modify this test. Let me know if I can help.
process
dlp testtriggers test is failing on ci test is failing with this error error grpc core rpcexception status statuscode invalidargument detail invalid built in info type name us zip i haven t been able to reproduce it on local skipping the test for now ace n assigning to you since you were the last to significantly modify this test let me know if i can help
1
65,822
19,712,374,107
IssuesEvent
2022-01-13 07:24:46
primefaces/primereact
https://api.github.com/repos/primefaces/primereact
closed
DataTable: ReferenceError: process is not defined regression in 7.1
defect
**I'm submitting a ...** (check one with "x") ``` [x] bug report [ ] feature request [ ] support request => Please do not submit support request here, instead see https://forum.primefaces.org/viewforum.php?f=57 ``` **Current behavior** <!-- Describe how the bug manifests. --> Rendering a route with a DataTable component results in the following error being displayed: ReferenceError: process is not defined at Function.createInlineStyle (http://localhost:3000/build/routes/index-KMD6POSB.js:1734:19) at DataTable2.createResponsiveStyle (http://localhost:3000/build/routes/index-KMD6POSB.js:16092:50) at DataTable2.componentDidMount (http://localhost:3000/build/routes/index-KMD6POSB.js:16613:14) at commitLifeCycles (http://localhost:3000/build/_shared/chunk-B6UQWYWJ.js:14962:30) at commitLayoutEffects (http://localhost:3000/build/_shared/chunk-B6UQWYWJ.js:16711:15) at HTMLUnknownElement.callCallback2 (http://localhost:3000/build/_shared/chunk-B6UQWYWJ.js:3675:22) at Object.invokeGuardedCallbackDev (http://localhost:3000/build/_shared/chunk-B6UQWYWJ.js:3700:24) at invokeGuardedCallback (http://localhost:3000/build/_shared/chunk-B6UQWYWJ.js:3734:39) at commitRootImpl (http://localhost:3000/build/_shared/chunk-B6UQWYWJ.js:16533:17) at unstable_runWithPriority (http://localhost:3000/build/_shared/chunk-B6UQWYWJ.js:346:20) This is a regression introduced in 7.1, seemingly by #2423 . Version 7.0 works fine. **Expected behavior** No crash. **Minimal reproduction of the problem with instructions** [PrimeReactBug.zip](https://github.com/primefaces/primereact/files/7739878/PrimeReactBug.zip) Simply run "npm install" then "npm run dev" and open localhost;3000 in the browser to observe the error. **Please tell us about your environment:** <!-- Operating system, IDE, package manager, HTTP server, ... --> Windows 10, VS Code. * **React version:** 17.0.2 * **PrimeReact version:** Regression in 7.1 * **Browser:** [all | Chrome XX | Firefox XX | IE XX | Safari XX | Mobile Chrome XX | Android X.X Web Browser | iOS XX Safari | iOS XX UIWebView | iOS XX WKWebView ] Chrome 96, Firefox 95 * **Language:** [all | TypeScript X.X | ES6/7 | ES5] * Typescript 4.1.2
1.0
DataTable: ReferenceError: process is not defined regression in 7.1 - **I'm submitting a ...** (check one with "x") ``` [x] bug report [ ] feature request [ ] support request => Please do not submit support request here, instead see https://forum.primefaces.org/viewforum.php?f=57 ``` **Current behavior** <!-- Describe how the bug manifests. --> Rendering a route with a DataTable component results in the following error being displayed: ReferenceError: process is not defined at Function.createInlineStyle (http://localhost:3000/build/routes/index-KMD6POSB.js:1734:19) at DataTable2.createResponsiveStyle (http://localhost:3000/build/routes/index-KMD6POSB.js:16092:50) at DataTable2.componentDidMount (http://localhost:3000/build/routes/index-KMD6POSB.js:16613:14) at commitLifeCycles (http://localhost:3000/build/_shared/chunk-B6UQWYWJ.js:14962:30) at commitLayoutEffects (http://localhost:3000/build/_shared/chunk-B6UQWYWJ.js:16711:15) at HTMLUnknownElement.callCallback2 (http://localhost:3000/build/_shared/chunk-B6UQWYWJ.js:3675:22) at Object.invokeGuardedCallbackDev (http://localhost:3000/build/_shared/chunk-B6UQWYWJ.js:3700:24) at invokeGuardedCallback (http://localhost:3000/build/_shared/chunk-B6UQWYWJ.js:3734:39) at commitRootImpl (http://localhost:3000/build/_shared/chunk-B6UQWYWJ.js:16533:17) at unstable_runWithPriority (http://localhost:3000/build/_shared/chunk-B6UQWYWJ.js:346:20) This is a regression introduced in 7.1, seemingly by #2423 . Version 7.0 works fine. **Expected behavior** No crash. **Minimal reproduction of the problem with instructions** [PrimeReactBug.zip](https://github.com/primefaces/primereact/files/7739878/PrimeReactBug.zip) Simply run "npm install" then "npm run dev" and open localhost;3000 in the browser to observe the error. **Please tell us about your environment:** <!-- Operating system, IDE, package manager, HTTP server, ... --> Windows 10, VS Code. * **React version:** 17.0.2 * **PrimeReact version:** Regression in 7.1 * **Browser:** [all | Chrome XX | Firefox XX | IE XX | Safari XX | Mobile Chrome XX | Android X.X Web Browser | iOS XX Safari | iOS XX UIWebView | iOS XX WKWebView ] Chrome 96, Firefox 95 * **Language:** [all | TypeScript X.X | ES6/7 | ES5] * Typescript 4.1.2
non_process
datatable referenceerror process is not defined regression in i m submitting a check one with x bug report feature request support request please do not submit support request here instead see current behavior rendering a route with a datatable component results in the following error being displayed referenceerror process is not defined at function createinlinestyle at createresponsivestyle at componentdidmount at commitlifecycles at commitlayouteffects at htmlunknownelement at object invokeguardedcallbackdev at invokeguardedcallback at commitrootimpl at unstable runwithpriority this is a regression introduced in seemingly by version works fine expected behavior no crash minimal reproduction of the problem with instructions simply run npm install then npm run dev and open localhost in the browser to observe the error please tell us about your environment windows vs code react version primereact version regression in browser chrome firefox language typescript
0
139,661
12,877,651,864
IssuesEvent
2020-07-11 12:21:59
mariamihai/udemy-sbm-beer-service
https://api.github.com/repos/mariamihai/udemy-sbm-beer-service
closed
Jackson and Lombok issues documented
documentation
Under the **225. Refactor Model to Common Package** clip, the model used by the services is set to a common package. This way, all the services have the model in the same named package. Instead of this, I opted to update the Jackson configuration and add the mapping to the messageConverter. This way, the services are completely independent from one another. Currently, this is done only for the Inventory Service, for the NewInventoryEvent class. This should be improved as needed along the way instead of adding the mapping for all the classes under model right now. Each service will be treated individually and modified as needed along the way. Another issue is using the @Builder annotation from Lombok with inheritance. There might be some issues along the way. The solution used in the course was to flatten the classes. Instead of this action, @SuperBuilder annotation might be used instead. If issues are encountered, more investigation should be done. Some possible information: [here](https://www.baeldung.com/lombok-builder-inheritance) and [here](https://stackoverflow.com/questions/49212930/lombok-builder-inheritance-simultaneously-working-for-parent-and-child).
1.0
Jackson and Lombok issues documented - Under the **225. Refactor Model to Common Package** clip, the model used by the services is set to a common package. This way, all the services have the model in the same named package. Instead of this, I opted to update the Jackson configuration and add the mapping to the messageConverter. This way, the services are completely independent from one another. Currently, this is done only for the Inventory Service, for the NewInventoryEvent class. This should be improved as needed along the way instead of adding the mapping for all the classes under model right now. Each service will be treated individually and modified as needed along the way. Another issue is using the @Builder annotation from Lombok with inheritance. There might be some issues along the way. The solution used in the course was to flatten the classes. Instead of this action, @SuperBuilder annotation might be used instead. If issues are encountered, more investigation should be done. Some possible information: [here](https://www.baeldung.com/lombok-builder-inheritance) and [here](https://stackoverflow.com/questions/49212930/lombok-builder-inheritance-simultaneously-working-for-parent-and-child).
non_process
jackson and lombok issues documented under the refactor model to common package clip the model used by the services is set to a common package this way all the services have the model in the same named package instead of this i opted to update the jackson configuration and add the mapping to the messageconverter this way the services are completely independent from one another currently this is done only for the inventory service for the newinventoryevent class this should be improved as needed along the way instead of adding the mapping for all the classes under model right now each service will be treated individually and modified as needed along the way another issue is using the builder annotation from lombok with inheritance there might be some issues along the way the solution used in the course was to flatten the classes instead of this action superbuilder annotation might be used instead if issues are encountered more investigation should be done some possible information and
0
8,051
11,220,789,648
IssuesEvent
2020-01-07 16:28:30
aiidateam/aiida-core
https://api.github.com/repos/aiidateam/aiida-core
opened
Error when loading checkpoint
topic/processes type/bug
After moving from ``1.0.0`` to the latest ``develop``, I'm getting errors where workchains except because their checkpoint could not be loaded (see full traceback below). These are _new_ processes, i.e. they did not exist before the code update. At least once, the issue occurred right after the daemon was restarted. This might just be the trigger for actually having to dump / load the checkpoint, though. I have tried isolating a minimal failing example, but without luck so far. Will update once I've got an example. @sphuber, could this be related to the switch from the YAML ``FullLoader`` to ``SafeLoader`` ? ``` 2020-01-07 15:35:38 [44621 | ERROR]: Traceback (most recent call last): File "/data/aiida/source/aiida-core/aiida/engine/persistence.py", line 124, in load_checkpoint bundle = serialize.deserialize(checkpoint) File "/data/aiida/source/aiida-core/aiida/orm/utils/serialize.py", line 230, in deserialize return yaml.load(serialized, Loader=AiiDALoader) File "/home/a-dogres/.virtualenvs/aiida/lib/python3.6/site-packages/yaml/__init__.py", line 114, in load return loader.get_single_data() File "/home/a-dogres/.virtualenvs/aiida/lib/python3.6/site-packages/yaml/constructor.py", line 43, in get_single_data return self.construct_document(node) File "/home/a-dogres/.virtualenvs/aiida/lib/python3.6/site-packages/yaml/constructor.py", line 52, in construct_document for dummy in generator: File "/home/a-dogres/.virtualenvs/aiida/lib/python3.6/site-packages/yaml/constructor.py", line 596, in construct_python_object state = self.construct_mapping(node, deep=deep) File "/home/a-dogres/.virtualenvs/aiida/lib/python3.6/site-packages/yaml/constructor.py", line 210, in construct_mapping return super().construct_mapping(node, deep=deep) File "/home/a-dogres/.virtualenvs/aiida/lib/python3.6/site-packages/yaml/constructor.py", line 135, in construct_mapping value = self.construct_object(value_node, deep=deep) File "/home/a-dogres/.virtualenvs/aiida/lib/python3.6/site-packages/yaml/constructor.py", line 92, in construct_object data = constructor(self, node) File "/home/a-dogres/.virtualenvs/aiida/lib/python3.6/site-packages/yaml/constructor.py", line 420, in construct_undefined node.start_mark) yaml.constructor.ConstructorError: could not determine a constructor for the tag 'tag:yaml.org,2002:python/object/apply: aiida.engine.processes.workchains.awaitable.AwaitableAction' in "<unicode string>", line 87, column 11: action: !!python/object/apply:aiida.engi ... ^ During handling of the above exception, another exception occurred: Traceback (most recent call last): File "/data/aiida/source/aiida-core/aiida/manage/external/rmq.py", line 187, in _continue result = yield super()._continue(communicator, pid, nowait, tag) File "/home/a-dogres/.virtualenvs/aiida/lib/python3.6/site-packages/tornado/gen.py", line 1055, in run value = future.result() File "/home/a-dogres/.virtualenvs/aiida/lib/python3.6/site-packages/tornado/concurrent.py", line 238, in result raise_exc_info(self._exc_info) File "<string>", line 4, in raise_exc_info File "/home/a-dogres/.virtualenvs/aiida/lib/python3.6/site-packages/tornado/gen.py", line 307, in wrapper yielded = next(result) File "/home/a-dogres/.virtualenvs/aiida/lib/python3.6/site-packages/plumpy/process_comms.py", line 546, in _continue saved_state = self._persister.load_checkpoint(pid, tag) File "/data/aiida/source/aiida-core/aiida/engine/persistence.py", line 127, in load_checkpoint 'Failed to load the checkpoint for process<{}>: {}'.format(pid, traceback.format_exc()) plumpy.exceptions.PersistenceError: Failed to load the checkpoint for process<55902>: Traceback (most recent call last): File "/data/aiida/source/aiida-core/aiida/engine/persistence.py", line 124, in load_checkpoint bundle = serialize.deserialize(checkpoint) File "/data/aiida/source/aiida-core/aiida/orm/utils/serialize.py", line 230, in deserialize return yaml.load(serialized, Loader=AiiDALoader) File "/home/a-dogres/.virtualenvs/aiida/lib/python3.6/site-packages/yaml/__init__.py", line 114, in load return loader.get_single_data() File "/home/a-dogres/.virtualenvs/aiida/lib/python3.6/site-packages/yaml/constructor.py", line 43, in get_single_data return self.construct_document(node) File "/home/a-dogres/.virtualenvs/aiida/lib/python3.6/site-packages/yaml/constructor.py", line 52, in construct_document for dummy in generator: File "/home/a-dogres/.virtualenvs/aiida/lib/python3.6/site-packages/yaml/constructor.py", line 596, in construct_python_object state = self.construct_mapping(node, deep=deep) File "/home/a-dogres/.virtualenvs/aiida/lib/python3.6/site-packages/yaml/constructor.py", line 210, in construct_mapping return super().construct_mapping(node, deep=deep) File "/home/a-dogres/.virtualenvs/aiida/lib/python3.6/site-packages/yaml/constructor.py", line 135, in construct_mapping value = self.construct_object(value_node, deep=deep) File "/home/a-dogres/.virtualenvs/aiida/lib/python3.6/site-packages/yaml/constructor.py", line 92, in construct_object data = constructor(self, node) File "/home/a-dogres/.virtualenvs/aiida/lib/python3.6/site-packages/yaml/constructor.py", line 420, in construct_undefined node.start_mark) yaml.constructor.ConstructorError: could not determine a constructor for the tag 'tag:yaml.org,2002:python/object/apply:aiida.engine.processes.workchains.awaitable.AwaitableAction' in "<unicode string>", line 87, column 11: action: !!python/object/apply:aiida.engi ... ```
1.0
Error when loading checkpoint - After moving from ``1.0.0`` to the latest ``develop``, I'm getting errors where workchains except because their checkpoint could not be loaded (see full traceback below). These are _new_ processes, i.e. they did not exist before the code update. At least once, the issue occurred right after the daemon was restarted. This might just be the trigger for actually having to dump / load the checkpoint, though. I have tried isolating a minimal failing example, but without luck so far. Will update once I've got an example. @sphuber, could this be related to the switch from the YAML ``FullLoader`` to ``SafeLoader`` ? ``` 2020-01-07 15:35:38 [44621 | ERROR]: Traceback (most recent call last): File "/data/aiida/source/aiida-core/aiida/engine/persistence.py", line 124, in load_checkpoint bundle = serialize.deserialize(checkpoint) File "/data/aiida/source/aiida-core/aiida/orm/utils/serialize.py", line 230, in deserialize return yaml.load(serialized, Loader=AiiDALoader) File "/home/a-dogres/.virtualenvs/aiida/lib/python3.6/site-packages/yaml/__init__.py", line 114, in load return loader.get_single_data() File "/home/a-dogres/.virtualenvs/aiida/lib/python3.6/site-packages/yaml/constructor.py", line 43, in get_single_data return self.construct_document(node) File "/home/a-dogres/.virtualenvs/aiida/lib/python3.6/site-packages/yaml/constructor.py", line 52, in construct_document for dummy in generator: File "/home/a-dogres/.virtualenvs/aiida/lib/python3.6/site-packages/yaml/constructor.py", line 596, in construct_python_object state = self.construct_mapping(node, deep=deep) File "/home/a-dogres/.virtualenvs/aiida/lib/python3.6/site-packages/yaml/constructor.py", line 210, in construct_mapping return super().construct_mapping(node, deep=deep) File "/home/a-dogres/.virtualenvs/aiida/lib/python3.6/site-packages/yaml/constructor.py", line 135, in construct_mapping value = self.construct_object(value_node, deep=deep) File 
"/home/a-dogres/.virtualenvs/aiida/lib/python3.6/site-packages/yaml/constructor.py", line 92, in construct_object data = constructor(self, node) File "/home/a-dogres/.virtualenvs/aiida/lib/python3.6/site-packages/yaml/constructor.py", line 420, in construct_undefined node.start_mark) yaml.constructor.ConstructorError: could not determine a constructor for the tag 'tag:yaml.org,2002:python/object/apply: aiida.engine.processes.workchains.awaitable.AwaitableAction' in "<unicode string>", line 87, column 11: action: !!python/object/apply:aiida.engi ... ^ During handling of the above exception, another exception occurred: Traceback (most recent call last): File "/data/aiida/source/aiida-core/aiida/manage/external/rmq.py", line 187, in _continue result = yield super()._continue(communicator, pid, nowait, tag) File "/home/a-dogres/.virtualenvs/aiida/lib/python3.6/site-packages/tornado/gen.py", line 1055, in run value = future.result() File "/home/a-dogres/.virtualenvs/aiida/lib/python3.6/site-packages/tornado/concurrent.py", line 238, in result raise_exc_info(self._exc_info) File "<string>", line 4, in raise_exc_info File "/home/a-dogres/.virtualenvs/aiida/lib/python3.6/site-packages/tornado/gen.py", line 307, in wrapper yielded = next(result) File "/home/a-dogres/.virtualenvs/aiida/lib/python3.6/site-packages/plumpy/process_comms.py", line 546, in _continue saved_state = self._persister.load_checkpoint(pid, tag) File "/data/aiida/source/aiida-core/aiida/engine/persistence.py", line 127, in load_checkpoint 'Failed to load the checkpoint for process<{}>: {}'.format(pid, traceback.format_exc()) plumpy.exceptions.PersistenceError: Failed to load the checkpoint for process<55902>: Traceback (most recent call last): File "/data/aiida/source/aiida-core/aiida/engine/persistence.py", line 124, in load_checkpoint bundle = serialize.deserialize(checkpoint) File "/data/aiida/source/aiida-core/aiida/orm/utils/serialize.py", line 230, in deserialize return yaml.load(serialized, 
Loader=AiiDALoader) File "/home/a-dogres/.virtualenvs/aiida/lib/python3.6/site-packages/yaml/__init__.py", line 114, in load return loader.get_single_data() File "/home/a-dogres/.virtualenvs/aiida/lib/python3.6/site-packages/yaml/constructor.py", line 43, in get_single_data return self.construct_document(node) File "/home/a-dogres/.virtualenvs/aiida/lib/python3.6/site-packages/yaml/constructor.py", line 52, in construct_document for dummy in generator: File "/home/a-dogres/.virtualenvs/aiida/lib/python3.6/site-packages/yaml/constructor.py", line 596, in construct_python_object state = self.construct_mapping(node, deep=deep) File "/home/a-dogres/.virtualenvs/aiida/lib/python3.6/site-packages/yaml/constructor.py", line 210, in construct_mapping return super().construct_mapping(node, deep=deep) File "/home/a-dogres/.virtualenvs/aiida/lib/python3.6/site-packages/yaml/constructor.py", line 135, in construct_mapping value = self.construct_object(value_node, deep=deep) File "/home/a-dogres/.virtualenvs/aiida/lib/python3.6/site-packages/yaml/constructor.py", line 92, in construct_object data = constructor(self, node) File "/home/a-dogres/.virtualenvs/aiida/lib/python3.6/site-packages/yaml/constructor.py", line 420, in construct_undefined node.start_mark) yaml.constructor.ConstructorError: could not determine a constructor for the tag 'tag:yaml.org,2002:python/object/apply:aiida.engine.processes.workchains.awaitable.AwaitableAction' in "<unicode string>", line 87, column 11: action: !!python/object/apply:aiida.engi ... ```
process
error when loading checkpoint after moving from to the latest develop i m getting errors where workchains except because their checkpoint could not be loaded see full traceback below these are new processes i e they did not exist before the code update at least once the issue occurred right after the daemon was restarted this might just be the trigger for actually having to dump load the checkpoint though i have tried isolating a minimal failing example but without luck so far will update once i ve got an example sphuber could this be related to the switch from the yaml fullloader to safeloader traceback most recent call last file data aiida source aiida core aiida engine persistence py line in load checkpoint bundle serialize deserialize checkpoint file data aiida source aiida core aiida orm utils serialize py line in deserialize return yaml load serialized loader aiidaloader file home a dogres virtualenvs aiida lib site packages yaml init py line in load return loader get single data file home a dogres virtualenvs aiida lib site packages yaml constructor py line in get single data return self construct document node file home a dogres virtualenvs aiida lib site packages yaml constructor py line in construct document for dummy in generator file home a dogres virtualenvs aiida lib site packages yaml constructor py line in construct python object state self construct mapping node deep deep file home a dogres virtualenvs aiida lib site packages yaml constructor py line in construct mapping return super construct mapping node deep deep file home a dogres virtualenvs aiida lib site packages yaml constructor py line in construct mapping value self construct object value node deep deep file home a dogres virtualenvs aiida lib site packages yaml constructor py line in construct object data constructor self node file home a dogres virtualenvs aiida lib site packages yaml constructor py line in construct undefined node start mark yaml constructor constructorerror could not 
determine a constructor for the tag tag yaml org python object apply aiida engine processes workchains awaitable awaitableaction in line column action python object apply aiida engi during handling of the above exception another exception occurred traceback most recent call last file data aiida source aiida core aiida manage external rmq py line in continue result yield super continue communicator pid nowait tag file home a dogres virtualenvs aiida lib site packages tornado gen py line in run value future result file home a dogres virtualenvs aiida lib site packages tornado concurrent py line in result raise exc info self exc info file line in raise exc info file home a dogres virtualenvs aiida lib site packages tornado gen py line in wrapper yielded next result file home a dogres virtualenvs aiida lib site packages plumpy process comms py line in continue saved state self persister load checkpoint pid tag file data aiida source aiida core aiida engine persistence py line in load checkpoint failed to load the checkpoint for process format pid traceback format exc plumpy exceptions persistenceerror failed to load the checkpoint for process traceback most recent call last file data aiida source aiida core aiida engine persistence py line in load checkpoint bundle serialize deserialize checkpoint file data aiida source aiida core aiida orm utils serialize py line in deserialize return yaml load serialized loader aiidaloader file home a dogres virtualenvs aiida lib site packages yaml init py line in load return loader get single data file home a dogres virtualenvs aiida lib site packages yaml constructor py line in get single data return self construct document node file home a dogres virtualenvs aiida lib site packages yaml constructor py line in construct document for dummy in generator file home a dogres virtualenvs aiida lib site packages yaml constructor py line in construct python object state self construct mapping node deep deep file home a dogres virtualenvs 
aiida lib site packages yaml constructor py line in construct mapping return super construct mapping node deep deep file home a dogres virtualenvs aiida lib site packages yaml constructor py line in construct mapping value self construct object value node deep deep file home a dogres virtualenvs aiida lib site packages yaml constructor py line in construct object data constructor self node file home a dogres virtualenvs aiida lib site packages yaml constructor py line in construct undefined node start mark yaml constructor constructorerror could not determine a constructor for the tag tag yaml org python object apply aiida engine processes workchains awaitable awaitableaction in line column action python object apply aiida engi
1
180,196
14,741,259,191
IssuesEvent
2021-01-07 10:20:40
crytic/building-secure-contracts
https://api.github.com/repos/crytic/building-secure-contracts
closed
Echidna: Add crytic integration walkthrough
Echidna documentation
https://crytic.io/ supports now Echidna, we need a walkthrough example on how it works.
1.0
Echidna: Add crytic integration walkthrough - https://crytic.io/ supports now Echidna, we need a walkthrough example on how it works.
non_process
echidna add crytic integration walkthrough supports now echidna we need a walkthrough example on how it works
0
17,236
22,959,073,979
IssuesEvent
2022-07-19 14:01:23
GabrielRPalma/DeepGenerativeModelling
https://api.github.com/repos/GabrielRPalma/DeepGenerativeModelling
closed
Studying Natural Language Processing
Python Natural Language Processing Literature Review
Here, I am going to study more methods for implementing autoencoder with text data. Study sources: - DataCamp courses - Research Papers - [Natural Language Processing In action](https://www.bookdepository.com/Natural-Language-Processing-in-Action-Lane-Hobson/9781617294631?redirected=true&utm_medium=Google&utm_campaign=Base3&utm_source=IE&utm_content=Natural-Language-Processing-in-Action&selectCurrency=EUR&w=AFFPAU96SQHJQ1A8VT5J&gclid=CjwKCAjw7vuUBhBUEiwAEdu2pOwAt7j6Qt1r-f4vlFlLTz1zso1VCQsb6q8T6COvvDzu0kwXa-2cYRoCrKMQAvD_BwE)
1.0
Studying Natural Language Processing - Here, I am going to study more methods for implementing autoencoder with text data. Study sources: - DataCamp courses - Research Papers - [Natural Language Processing In action](https://www.bookdepository.com/Natural-Language-Processing-in-Action-Lane-Hobson/9781617294631?redirected=true&utm_medium=Google&utm_campaign=Base3&utm_source=IE&utm_content=Natural-Language-Processing-in-Action&selectCurrency=EUR&w=AFFPAU96SQHJQ1A8VT5J&gclid=CjwKCAjw7vuUBhBUEiwAEdu2pOwAt7j6Qt1r-f4vlFlLTz1zso1VCQsb6q8T6COvvDzu0kwXa-2cYRoCrKMQAvD_BwE)
process
studying natural language processing here i am going to study more methods for implementing autoencoder with text data study sources datacamp courses research papers
1
6,872
10,001,458,743
IssuesEvent
2019-07-12 15:40:37
googleapis/google-cloud-cpp-spanner
https://api.github.com/repos/googleapis/google-cloud-cpp-spanner
opened
Consider: Should we "own" github.com/GoogleCloudPlatform/cpp-docs-samples
type: docs type: process type: question
https://github.com/GoogleCloudPlatform/cpp-docs-samples exists. I believe it predates our supported google-cloud-cpp and google-cloud-cpp-spanner repos. The cpp-docs-samples repo appears to be fairly ignored at this point. Should we take "ownership" of it and put our C++ samples in that repo? Should we put only "big" samples in that repo, or all of our samples? Should we deprecate that repo? I'd be curious to understand better how other cloud client libraries use the `<lang>-docs-samples` repos.
1.0
Consider: Should we "own" github.com/GoogleCloudPlatform/cpp-docs-samples - https://github.com/GoogleCloudPlatform/cpp-docs-samples exists. I believe it predates our supported google-cloud-cpp and google-cloud-cpp-spanner repos. The cpp-docs-samples repo appears to be fairly ignored at this point. Should we take "ownership" of it and put our C++ samples in that repo? Should we put only "big" samples in that repo, or all of our samples? Should we deprecate that repo? I'd be curious to understand better how other cloud client libraries use the `<lang>-docs-samples` repos.
process
consider should we own github com googlecloudplatform cpp docs samples exists i believe it predates our supported google cloud cpp and google cloud cpp spanner repos the cpp docs samples repo appears to be fairly ignored at this point should we take ownership of it and put our c samples in that repo should we put only big samples in that repo or all of our samples should we deprecate that repo i d be curious to understand better how other cloud client libraries use the docs samples repos
1
17,825
23,753,924,612
IssuesEvent
2022-09-01 00:00:38
Azure/azure-sdk-tools
https://api.github.com/repos/Azure/azure-sdk-tools
opened
Code Documentation Generation Tool from Cadl files
Epic Central-EngSys Cadl WS: Process Tools & Automation
Tools that produce code documentation for the SDK team. The documentation comes from Cadl and is published in ms.docs. We currently do this for Swagger files.
1.0
Code Documentation Generation Tool from Cadl files - Tools that produce code documentation for the SDK team. The documentation comes from Cadl and is published in ms.docs. We currently do this for Swagger files.
process
code documentation generation tool from cadl files tools that produce code documentation for the sdk team the documentation comes from cadl and is published in ms docs we currently do this for swagger files
1
22,336
30,928,691,108
IssuesEvent
2023-08-06 20:17:44
metabase/metabase
https://api.github.com/repos/metabase/metabase
closed
v0.46.6.4 : Migration from H2 to Postgres fails with Command failed with exception: ERROR: insert or update on table "metabase_table" violates foreign key constraint "fk_table_ref_database_id"
Type:Bug .Team/QueryProcessor :hammer_and_wrench:
### Describe the bug Try to migrate meta database from H2 to production database ``` export MB_DB_TYPE=postgres export MB_DB_CONNECTION_URI="jdbc:postgresql://localhost:5432/metabase?user=postgres&password=mysecret" java -jar metabase.jar load-from-h2 ./metabase.db ``` Get the following error: ``` rg.postgresql.util.PSQLException: ERROR: insert or update on table "metabase_table" violates foreign key constraint "fk_table_ref_database_id" Detail: Key (db_id)=(1) is not present in table "metabase_database". at org.postgresql.core.v3.QueryExecutorImpl.receiveErrorResponse(QueryExecutorImpl.java:2676) at org.postgresql.core.v3.QueryExecutorImpl.processResults(QueryExecutorImpl.java:2366) at org.postgresql.core.v3.QueryExecutorImpl.execute(QueryExecutorImpl.java:356) at org.postgresql.core.v3.QueryExecutorImpl.execute(QueryExecutorImpl.java:316) at org.postgresql.jdbc.PgConnection.executeTransactionCommand(PgConnection.java:879) at org.postgresql.jdbc.PgConnection.commit(PgConnection.java:901) at com.mchange.v2.c3p0.impl.NewProxyConnection.commit(NewProxyConnection.java:981) at clojure.java.jdbc$db_transaction_STAR_.invokeStatic(jdbc.clj:815) at clojure.java.jdbc$db_transaction_STAR_.invoke(jdbc.clj:776) at clojure.java.jdbc$db_transaction_STAR_.invokeStatic(jdbc.clj:852) at clojure.java.jdbc$db_transaction_STAR_.invoke(jdbc.clj:776) at clojure.java.jdbc$db_transaction_STAR_.invokeStatic(jdbc.clj:789) at clojure.java.jdbc$db_transaction_STAR_.invoke(jdbc.clj:776) at metabase.cmd.copy$fn__102278$copy_BANG___102283$fn__102284.invoke(copy.clj:391) at metabase.cmd.copy$fn__102278$copy_BANG___102283.invoke(copy.clj:368) at metabase.cmd.load_from_h2$load_from_h2_BANG_.invokeStatic(load_from_h2.clj:36) at metabase.cmd.load_from_h2$load_from_h2_BANG_.invoke(load_from_h2.clj:26) at clojure.lang.Var.invoke(Var.java:384) at metabase.cmd$load_from_h2.invokeStatic(cmd.clj:48) at metabase.cmd$load_from_h2.invoke(cmd.clj:42) at clojure.lang.AFn.applyToHelper(AFn.java:154) at 
clojure.lang.AFn.applyTo(AFn.java:144) at clojure.core$apply.invokeStatic(core.clj:667) at clojure.core$apply.invoke(core.clj:662) at metabase.cmd$run_cmd$fn__103139.invoke(cmd.clj:264) at metabase.cmd$run_cmd.invokeStatic(cmd.clj:264) at metabase.cmd$run_cmd.invoke(cmd.clj:255) at clojure.lang.Var.invoke(Var.java:388) at metabase.core$run_cmd.invokeStatic(core.clj:166) at metabase.core$run_cmd.invoke(core.clj:164) at metabase.core$_main.invokeStatic(core.clj:188) at metabase.core$_main.doInvoke(core.clj:183) at clojure.lang.RestFn.applyTo(RestFn.java:137) at clojure.lang.Var.applyTo(Var.java:705) at clojure.core$apply.invokeStatic(core.clj:667) at clojure.core$apply.invoke(core.clj:662) at metabase.bootstrap$_main.invokeStatic(bootstrap.clj:31) at metabase.bootstrap$_main.doInvoke(bootstrap.clj:28) at clojure.lang.RestFn.applyTo(RestFn.java:137) at metabase.bootstrap.main(Unknown Source) ``` Note that this is a clean install. I have only downloaded the jar file and started it once with connection to a postgres database. So the Postgres schema has been sync'ed once. Then I shutdown the metabase instance and immediately started the migration. ### To Reproduce 1. Download metabase jar 2. Open web-front-end in browser 3. follow setup steps and connect to one single Postgres database 4. Wait until the postgres schema is sync'ed 5. Shutdown metabase 6. Start migration as described above ### Expected behavior I expect the migration to finish without error ### Logs I can no longer start metabase after the failed migration. However I can provide my terminal output from the failed migration: ``` % java -jar metabase.jar load-from-h2 ./metabase.db 2023-08-02 13:00:44,989 INFO metabase.util :: Maximum memory available to JVM: 4,0 GB 2023-08-02 13:00:46,944 INFO util.encryption :: Saved credentials encryption is DISABLED for this Metabase instance. 
🔓 For more information, see https://metabase.com/docs/latest/operations-guide/encrypting-database-details-at-rest.html ..instrumented #'metabase.util.malli/with-api-error-message ..instrumented #'metabase.util.honey-sql-2/identifier ..instrumented #'metabase.util.honey-sql-2/normalize-type-info ..instrumented #'metabase.util.honey-sql-2/with-database-type-info ..instrumented #'metabase.util.honey-sql-2/cast ..instrumented #'metabase.util.honey-sql-2/quoted-cast ..instrumented #'metabase.util.honey-sql-2/maybe-cast ..instrumented #'metabase.models.permissions/classify-path ..instrumented #'metabase.models.permissions/classify-data-path ..instrumented #'metabase.models.permissions/generate-graph ..instrumented #'metabase.models.permissions/->v2-path ..instrumented #'metabase.models.permissions/update-db-data-access-permissions! ..instrumented #'metabase.models.permissions/update-group-permissions! ..instrumented #'metabase.models.permissions/update-data-perms-graph! ..instrumented #'metabase.models.parameter-card/upsert-or-delete-from-parameters! 
WARNING: abs already refers to: #'clojure.core/abs in namespace: kixi.stats.math, being replaced by: #'kixi.stats.math/abs WARNING: abs already refers to: #'clojure.core/abs in namespace: kixi.stats.test, being replaced by: #'kixi.stats.math/abs WARNING: abs already refers to: #'clojure.core/abs in namespace: kixi.stats.distribution, being replaced by: #'kixi.stats.math/abs ..instrumented #'metabase.query-processor.middleware.permissions/check-query-permissions* ..instrumented #'metabase.driver.sql-jdbc.sync.describe-database/simple-select-probe-query 2023-08-02 13:00:52,557 INFO driver.impl :: Registered abstract driver :sql 🚚 2023-08-02 13:00:52,560 INFO metabase.util :: ⮦ Load driver :sql took 88,6 ms 2023-08-02 13:00:52,564 INFO driver.impl :: Registered abstract driver :sql-jdbc (parents: [:sql]) 🚚 2023-08-02 13:00:52,568 INFO metabase.util :: Load driver :sql-jdbc took 99,9 ms 2023-08-02 13:00:52,568 INFO driver.impl :: Registered driver :h2 (parents: [:sql-jdbc]) 🚚 ..instrumented #'metabase.driver.h2/classify-query 2023-08-02 13:00:52,615 INFO driver.impl :: Registered driver :mysql (parents: [:sql-jdbc]) 🚚 2023-08-02 13:00:52,644 INFO driver.impl :: Registered driver :postgres (parents: [:sql-jdbc]) 🚚 ..instrumented #'metabase.models.params.custom-values/values-from-card ..instrumented #'metabase.api.card/param-values ..instrumented #'metabase.api.dashboard/chain-filter ..instrumented #'metabase.api.setup/state-for-checklist ..instrumented #'metabase.api.setup/checklist-items 2023-08-02 13:00:54,447 INFO metabase.core :: Metabase v0.46.6.4 (7c60aca release-x.46.6.x) Copyright © 2023 Metabase, Inc. Metabase Enterprise Edition extensions are NOT PRESENT. 2023-08-02 13:00:54,504 INFO cmd.copy :: Set up h2 source database and run migrations... 2023-08-02 13:00:54,507 INFO db.setup :: Verifying h2 Database Connection ... 2023-08-02 13:00:54,909 INFO db.setup :: Successfully verified H2 2.1.212 (2022-04-09) application database connection. 
✅ 2023-08-02 13:00:54,910 INFO db.setup :: Running Database Migrations... 2023-08-02 13:00:54,912 INFO db.setup :: Setting up Liquibase... 2023-08-02 13:00:55,151 INFO db.setup :: Liquibase is ready. 2023-08-02 13:00:55,152 INFO db.liquibase :: Checking if Database has unrun migrations... 2023-08-02 13:00:56,172 INFO db.setup :: Database Migrations Current ... ✅ 2023-08-02 13:00:56,173 INFO db.data-migrations :: Running all necessary data migrations, this may take a minute. 2023-08-02 13:00:56,662 INFO db.data-migrations :: Finished running data migrations. 2023-08-02 13:00:56,663 INFO metabase.util :: Database setup took 2,2 s 2023-08-02 13:00:56,663 INFO cmd.copy :: [OK] 2023-08-02 13:00:56,664 INFO cmd.copy :: Set up postgres target database and run migrations... 2023-08-02 13:00:56,665 INFO db.setup :: Verifying postgres Database Connection ... 2023-08-02 13:00:57,068 INFO db.setup :: Successfully verified PostgreSQL 15.1 (Debian 15.1-1.pgdg110+1) application database connection. ✅ 2023-08-02 13:00:57,068 INFO db.setup :: Running Database Migrations... 2023-08-02 13:00:57,068 INFO db.setup :: Setting up Liquibase... 2023-08-02 13:00:57,106 INFO db.setup :: Liquibase is ready. 2023-08-02 13:00:57,106 INFO db.liquibase :: Checking if Database has unrun migrations... 2023-08-02 13:00:57,977 INFO db.liquibase :: Database has unrun migrations. Waiting for migration lock to be cleared... 2023-08-02 13:00:58,148 INFO db.liquibase :: Migration lock is cleared. Running migrations... 2023-08-02 13:01:07,198 INFO impl.StdSchedulerFactory :: Using default implementation for ThreadExecutor 2023-08-02 13:01:07,215 INFO core.SchedulerSignalerImpl :: Initialized Scheduler Signaller of type: class org.quartz.core.SchedulerSignalerImpl 2023-08-02 13:01:07,215 INFO core.QuartzScheduler :: Quartz Scheduler v.2.3.2 created. 2023-08-02 13:01:07,216 INFO jdbcjobstore.JobStoreTX :: Using db table-based data access locking (synchronization). 
2023-08-02 13:01:07,217 INFO jdbcjobstore.JobStoreTX :: JobStoreTX initialized. 2023-08-02 13:01:07,218 INFO core.QuartzScheduler :: Scheduler meta-data: Quartz Scheduler (v2.3.2) 'MetabaseScheduler' with instanceId 'BodosMac13-3.fritz.box1690974067204' Scheduler class: 'org.quartz.core.QuartzScheduler' - running locally. NOT STARTED. Currently in standby mode. Number of jobs executed: 0 Using thread pool 'org.quartz.simpl.SimpleThreadPool' - with 10 threads. Using job-store 'org.quartz.impl.jdbcjobstore.JobStoreTX' - which supports persistence. and is clustered. 2023-08-02 13:01:07,218 INFO impl.StdSchedulerFactory :: Quartz scheduler 'MetabaseScheduler' initialized from default resource file in Quartz package: 'quartz.properties' 2023-08-02 13:01:07,218 INFO impl.StdSchedulerFactory :: Quartz scheduler version: 2.3.2 2023-08-02 13:01:07,252 INFO core.QuartzScheduler :: Scheduler MetabaseScheduler_$_BodosMac13-3.fritz.box1690974067204 started. 2023-08-02 13:01:07,298 INFO core.QuartzScheduler :: Scheduler MetabaseScheduler_$_BodosMac13-3.fritz.box1690974067204 shutting down. 2023-08-02 13:01:07,299 INFO core.QuartzScheduler :: Scheduler MetabaseScheduler_$_BodosMac13-3.fritz.box1690974067204 paused. 2023-08-02 13:01:07,299 INFO core.QuartzScheduler :: Scheduler MetabaseScheduler_$_BodosMac13-3.fritz.box1690974067204 shutdown complete. 2023-08-02 13:01:07,380 INFO db.setup :: Database Migrations Current ... ✅ 2023-08-02 13:01:07,380 INFO metabase.util :: Database setup took 10,7 s 2023-08-02 13:01:07,381 INFO cmd.copy :: [OK] 2023-08-02 13:01:07,382 INFO cmd.copy :: Testing if target postgres database is already populated... 2023-08-02 13:01:07,385 INFO cmd.copy :: [OK] 2023-08-02 13:01:07,385 INFO cmd.copy :: Clearing default entries created by Liquibase migrations... 2023-08-02 13:01:07,387 INFO cmd.copy :: Temporarily disabling DB constraints... 2023-08-02 13:01:07,938 INFO cmd.copy :: [OK] 2023-08-02 13:01:09,614 INFO cmd.copy :: Re-enabling DB constraints... 
2023-08-02 13:01:09,614 INFO cmd.copy :: [OK] 2023-08-02 13:01:09,615 INFO cmd.copy :: [OK] 2023-08-02 13:01:09,618 INFO cmd.copy :: Temporarily disabling DB constraints... 2023-08-02 13:01:09,806 INFO cmd.copy :: [OK] 2023-08-02 13:01:09,813 INFO cmd.copy :: Copying instances of Database... 2023-08-02 13:01:09,822 INFO cmd.copy :: copied 1 instances. 2023-08-02 13:01:09,823 INFO cmd.copy :: Copying instances of User... 2023-08-02 13:01:09,827 INFO cmd.copy :: copied 1 instances. 2023-08-02 13:01:09,828 INFO cmd.copy :: Copying instances of Setting... 2023-08-02 13:01:09,834 INFO cmd.copy :: copied 12 instances. 2023-08-02 13:01:09,842 INFO cmd.copy :: Copying instances of Table... 2023-08-02 13:01:09,873 INFO cmd.copy :: copied 115 instances. 2023-08-02 13:01:09,923 INFO cmd.copy :: Copying instances of Field... 2023-08-02 13:01:10,371 INFO cmd.copy :: copied 2,521 instances. 2023-08-02 13:01:10,378 INFO cmd.copy :: Copying instances of FieldValues... 2023-08-02 13:01:10,389 INFO cmd.copy :: copied 24 instances. 2023-08-02 13:01:10,393 INFO cmd.copy :: Copying instances of Session... 2023-08-02 13:01:10,396 INFO cmd.copy :: copied 1 instances. 2023-08-02 13:01:10,398 INFO cmd.copy :: Copying instances of Collection... 2023-08-02 13:01:10,401 INFO cmd.copy :: copied 1 instances. 2023-08-02 13:01:10,406 INFO cmd.copy :: Copying instances of Activity... 2023-08-02 13:01:10,414 INFO cmd.copy :: copied 2 instances. 2023-08-02 13:01:10,417 INFO cmd.copy :: Copying instances of PermissionsGroup... 2023-08-02 13:01:10,423 INFO cmd.copy :: copied 2 instances. 2023-08-02 13:01:10,424 INFO cmd.copy :: Copying instances of PermissionsGroupMembership... 2023-08-02 13:01:10,430 INFO cmd.copy :: copied 2 instances. 2023-08-02 13:01:10,431 INFO cmd.copy :: Copying instances of Permissions... 2023-08-02 13:01:10,437 INFO cmd.copy :: copied 10 instances. 2023-08-02 13:01:10,441 INFO cmd.copy :: Copying instances of DataMigrations... 
2023-08-02 13:01:10,447 INFO cmd.copy :: copied 2 instances. 2023-08-02 13:01:10,448 INFO cmd.copy :: Re-enabling DB constraints... 2023-08-02 13:01:10,448 INFO cmd.copy :: [OK] org.postgresql.util.PSQLException: ERROR: insert or update on table "metabase_table" violates foreign key constraint "fk_table_ref_database_id" Detail: Key (db_id)=(1) is not present in table "metabase_database". at org.postgresql.core.v3.QueryExecutorImpl.receiveErrorResponse(QueryExecutorImpl.java:2676) at org.postgresql.core.v3.QueryExecutorImpl.processResults(QueryExecutorImpl.java:2366) at org.postgresql.core.v3.QueryExecutorImpl.execute(QueryExecutorImpl.java:356) at org.postgresql.core.v3.QueryExecutorImpl.execute(QueryExecutorImpl.java:316) at org.postgresql.jdbc.PgConnection.executeTransactionCommand(PgConnection.java:879) at org.postgresql.jdbc.PgConnection.commit(PgConnection.java:901) at com.mchange.v2.c3p0.impl.NewProxyConnection.commit(NewProxyConnection.java:981) at clojure.java.jdbc$db_transaction_STAR_.invokeStatic(jdbc.clj:815) at clojure.java.jdbc$db_transaction_STAR_.invoke(jdbc.clj:776) at clojure.java.jdbc$db_transaction_STAR_.invokeStatic(jdbc.clj:852) at clojure.java.jdbc$db_transaction_STAR_.invoke(jdbc.clj:776) at clojure.java.jdbc$db_transaction_STAR_.invokeStatic(jdbc.clj:789) at clojure.java.jdbc$db_transaction_STAR_.invoke(jdbc.clj:776) at metabase.cmd.copy$fn__102278$copy_BANG___102283$fn__102284.invoke(copy.clj:391) at metabase.cmd.copy$fn__102278$copy_BANG___102283.invoke(copy.clj:368) at metabase.cmd.load_from_h2$load_from_h2_BANG_.invokeStatic(load_from_h2.clj:36) at metabase.cmd.load_from_h2$load_from_h2_BANG_.invoke(load_from_h2.clj:26) at clojure.lang.Var.invoke(Var.java:384) at metabase.cmd$load_from_h2.invokeStatic(cmd.clj:48) at metabase.cmd$load_from_h2.invoke(cmd.clj:42) at clojure.lang.AFn.applyToHelper(AFn.java:154) at clojure.lang.AFn.applyTo(AFn.java:144) at clojure.core$apply.invokeStatic(core.clj:667) at 
clojure.core$apply.invoke(core.clj:662) at metabase.cmd$run_cmd$fn__103139.invoke(cmd.clj:264) at metabase.cmd$run_cmd.invokeStatic(cmd.clj:264) at metabase.cmd$run_cmd.invoke(cmd.clj:255) at clojure.lang.Var.invoke(Var.java:388) at metabase.core$run_cmd.invokeStatic(core.clj:166) at metabase.core$run_cmd.invoke(core.clj:164) at metabase.core$_main.invokeStatic(core.clj:188) at metabase.core$_main.doInvoke(core.clj:183) at clojure.lang.RestFn.applyTo(RestFn.java:137) at clojure.lang.Var.applyTo(Var.java:705) at clojure.core$apply.invokeStatic(core.clj:667) at clojure.core$apply.invoke(core.clj:662) at metabase.bootstrap$_main.invokeStatic(bootstrap.clj:31) at metabase.bootstrap$_main.doInvoke(bootstrap.clj:28) at clojure.lang.RestFn.applyTo(RestFn.java:137) at metabase.bootstrap.main(Unknown Source) Command failed with exception: ERROR: insert or update on table "metabase_table" violates foreign key constraint "fk_table_ref_database_id" Detail: Key (db_id)=(1) is not present in table "metabase_database". ``` ### Information about your Metabase installation ```JSON After migration I can no longer open Admin->Troubleshooting Metabase v0.46.6.4 (7c60aca release-x.46.6.x) macOS ``` ### Severity it is blocking my usage entirely ### Additional context _No response_
1.0
v0.46.6.4 : Migration from H2 to Postgres fails with Command failed with exception: ERROR: insert or update on table "metabase_table" violates foreign key constraint "fk_table_ref_database_id" - ### Describe the bug Try to migrate meta database from H2 to production database ``` export MB_DB_TYPE=postgres export MB_DB_CONNECTION_URI="jdbc:postgresql://localhost:5432/metabase?user=postgres&password=mysecret" java -jar metabase.jar load-from-h2 ./metabase.db ``` Get the following error: ``` org.postgresql.util.PSQLException: ERROR: insert or update on table "metabase_table" violates foreign key constraint "fk_table_ref_database_id" Detail: Key (db_id)=(1) is not present in table "metabase_database". at org.postgresql.core.v3.QueryExecutorImpl.receiveErrorResponse(QueryExecutorImpl.java:2676) at org.postgresql.core.v3.QueryExecutorImpl.processResults(QueryExecutorImpl.java:2366) at org.postgresql.core.v3.QueryExecutorImpl.execute(QueryExecutorImpl.java:356) at org.postgresql.core.v3.QueryExecutorImpl.execute(QueryExecutorImpl.java:316) at org.postgresql.jdbc.PgConnection.executeTransactionCommand(PgConnection.java:879) at org.postgresql.jdbc.PgConnection.commit(PgConnection.java:901) at com.mchange.v2.c3p0.impl.NewProxyConnection.commit(NewProxyConnection.java:981) at clojure.java.jdbc$db_transaction_STAR_.invokeStatic(jdbc.clj:815) at clojure.java.jdbc$db_transaction_STAR_.invoke(jdbc.clj:776) at clojure.java.jdbc$db_transaction_STAR_.invokeStatic(jdbc.clj:852) at clojure.java.jdbc$db_transaction_STAR_.invoke(jdbc.clj:776) at clojure.java.jdbc$db_transaction_STAR_.invokeStatic(jdbc.clj:789) at clojure.java.jdbc$db_transaction_STAR_.invoke(jdbc.clj:776) at metabase.cmd.copy$fn__102278$copy_BANG___102283$fn__102284.invoke(copy.clj:391) at metabase.cmd.copy$fn__102278$copy_BANG___102283.invoke(copy.clj:368) at metabase.cmd.load_from_h2$load_from_h2_BANG_.invokeStatic(load_from_h2.clj:36) at metabase.cmd.load_from_h2$load_from_h2_BANG_.invoke(load_from_h2.clj:26) at 
clojure.lang.Var.invoke(Var.java:384) at metabase.cmd$load_from_h2.invokeStatic(cmd.clj:48) at metabase.cmd$load_from_h2.invoke(cmd.clj:42) at clojure.lang.AFn.applyToHelper(AFn.java:154) at clojure.lang.AFn.applyTo(AFn.java:144) at clojure.core$apply.invokeStatic(core.clj:667) at clojure.core$apply.invoke(core.clj:662) at metabase.cmd$run_cmd$fn__103139.invoke(cmd.clj:264) at metabase.cmd$run_cmd.invokeStatic(cmd.clj:264) at metabase.cmd$run_cmd.invoke(cmd.clj:255) at clojure.lang.Var.invoke(Var.java:388) at metabase.core$run_cmd.invokeStatic(core.clj:166) at metabase.core$run_cmd.invoke(core.clj:164) at metabase.core$_main.invokeStatic(core.clj:188) at metabase.core$_main.doInvoke(core.clj:183) at clojure.lang.RestFn.applyTo(RestFn.java:137) at clojure.lang.Var.applyTo(Var.java:705) at clojure.core$apply.invokeStatic(core.clj:667) at clojure.core$apply.invoke(core.clj:662) at metabase.bootstrap$_main.invokeStatic(bootstrap.clj:31) at metabase.bootstrap$_main.doInvoke(bootstrap.clj:28) at clojure.lang.RestFn.applyTo(RestFn.java:137) at metabase.bootstrap.main(Unknown Source) ``` Note that this is a clean install. I have only downloaded the jar file and started it once with connection to a postgres database. So the Postgres schema has been sync'ed once. Then I shutdown the metabase instance and immediately started the migration. ### To Reproduce 1. Download metabase jar 2. Open web-front-end in browser 3. follow setup steps and connect to one single Postgres database 4. Wait until the postgres schema is sync'ed 5. Shutdown metabase 6. Start migration as described above ### Expected behavior I expect the migration to finish without error ### Logs I can no longer start metabase after the failed migration. 
However I can provide my terminal output from the failed migration: ``` % java -jar metabase.jar load-from-h2 ./metabase.db 2023-08-02 13:00:44,989 INFO metabase.util :: Maximum memory available to JVM: 4,0 GB 2023-08-02 13:00:46,944 INFO util.encryption :: Saved credentials encryption is DISABLED for this Metabase instance. 🔓 For more information, see https://metabase.com/docs/latest/operations-guide/encrypting-database-details-at-rest.html ..instrumented #'metabase.util.malli/with-api-error-message ..instrumented #'metabase.util.honey-sql-2/identifier ..instrumented #'metabase.util.honey-sql-2/normalize-type-info ..instrumented #'metabase.util.honey-sql-2/with-database-type-info ..instrumented #'metabase.util.honey-sql-2/cast ..instrumented #'metabase.util.honey-sql-2/quoted-cast ..instrumented #'metabase.util.honey-sql-2/maybe-cast ..instrumented #'metabase.models.permissions/classify-path ..instrumented #'metabase.models.permissions/classify-data-path ..instrumented #'metabase.models.permissions/generate-graph ..instrumented #'metabase.models.permissions/->v2-path ..instrumented #'metabase.models.permissions/update-db-data-access-permissions! ..instrumented #'metabase.models.permissions/update-group-permissions! ..instrumented #'metabase.models.permissions/update-data-perms-graph! ..instrumented #'metabase.models.parameter-card/upsert-or-delete-from-parameters! 
WARNING: abs already refers to: #'clojure.core/abs in namespace: kixi.stats.math, being replaced by: #'kixi.stats.math/abs WARNING: abs already refers to: #'clojure.core/abs in namespace: kixi.stats.test, being replaced by: #'kixi.stats.math/abs WARNING: abs already refers to: #'clojure.core/abs in namespace: kixi.stats.distribution, being replaced by: #'kixi.stats.math/abs ..instrumented #'metabase.query-processor.middleware.permissions/check-query-permissions* ..instrumented #'metabase.driver.sql-jdbc.sync.describe-database/simple-select-probe-query 2023-08-02 13:00:52,557 INFO driver.impl :: Registered abstract driver :sql 🚚 2023-08-02 13:00:52,560 INFO metabase.util :: ⮦ Load driver :sql took 88,6 ms 2023-08-02 13:00:52,564 INFO driver.impl :: Registered abstract driver :sql-jdbc (parents: [:sql]) 🚚 2023-08-02 13:00:52,568 INFO metabase.util :: Load driver :sql-jdbc took 99,9 ms 2023-08-02 13:00:52,568 INFO driver.impl :: Registered driver :h2 (parents: [:sql-jdbc]) 🚚 ..instrumented #'metabase.driver.h2/classify-query 2023-08-02 13:00:52,615 INFO driver.impl :: Registered driver :mysql (parents: [:sql-jdbc]) 🚚 2023-08-02 13:00:52,644 INFO driver.impl :: Registered driver :postgres (parents: [:sql-jdbc]) 🚚 ..instrumented #'metabase.models.params.custom-values/values-from-card ..instrumented #'metabase.api.card/param-values ..instrumented #'metabase.api.dashboard/chain-filter ..instrumented #'metabase.api.setup/state-for-checklist ..instrumented #'metabase.api.setup/checklist-items 2023-08-02 13:00:54,447 INFO metabase.core :: Metabase v0.46.6.4 (7c60aca release-x.46.6.x) Copyright © 2023 Metabase, Inc. Metabase Enterprise Edition extensions are NOT PRESENT. 2023-08-02 13:00:54,504 INFO cmd.copy :: Set up h2 source database and run migrations... 2023-08-02 13:00:54,507 INFO db.setup :: Verifying h2 Database Connection ... 2023-08-02 13:00:54,909 INFO db.setup :: Successfully verified H2 2.1.212 (2022-04-09) application database connection. 
✅ 2023-08-02 13:00:54,910 INFO db.setup :: Running Database Migrations... 2023-08-02 13:00:54,912 INFO db.setup :: Setting up Liquibase... 2023-08-02 13:00:55,151 INFO db.setup :: Liquibase is ready. 2023-08-02 13:00:55,152 INFO db.liquibase :: Checking if Database has unrun migrations... 2023-08-02 13:00:56,172 INFO db.setup :: Database Migrations Current ... ✅ 2023-08-02 13:00:56,173 INFO db.data-migrations :: Running all necessary data migrations, this may take a minute. 2023-08-02 13:00:56,662 INFO db.data-migrations :: Finished running data migrations. 2023-08-02 13:00:56,663 INFO metabase.util :: Database setup took 2,2 s 2023-08-02 13:00:56,663 INFO cmd.copy :: [OK] 2023-08-02 13:00:56,664 INFO cmd.copy :: Set up postgres target database and run migrations... 2023-08-02 13:00:56,665 INFO db.setup :: Verifying postgres Database Connection ... 2023-08-02 13:00:57,068 INFO db.setup :: Successfully verified PostgreSQL 15.1 (Debian 15.1-1.pgdg110+1) application database connection. ✅ 2023-08-02 13:00:57,068 INFO db.setup :: Running Database Migrations... 2023-08-02 13:00:57,068 INFO db.setup :: Setting up Liquibase... 2023-08-02 13:00:57,106 INFO db.setup :: Liquibase is ready. 2023-08-02 13:00:57,106 INFO db.liquibase :: Checking if Database has unrun migrations... 2023-08-02 13:00:57,977 INFO db.liquibase :: Database has unrun migrations. Waiting for migration lock to be cleared... 2023-08-02 13:00:58,148 INFO db.liquibase :: Migration lock is cleared. Running migrations... 2023-08-02 13:01:07,198 INFO impl.StdSchedulerFactory :: Using default implementation for ThreadExecutor 2023-08-02 13:01:07,215 INFO core.SchedulerSignalerImpl :: Initialized Scheduler Signaller of type: class org.quartz.core.SchedulerSignalerImpl 2023-08-02 13:01:07,215 INFO core.QuartzScheduler :: Quartz Scheduler v.2.3.2 created. 2023-08-02 13:01:07,216 INFO jdbcjobstore.JobStoreTX :: Using db table-based data access locking (synchronization). 
2023-08-02 13:01:07,217 INFO jdbcjobstore.JobStoreTX :: JobStoreTX initialized. 2023-08-02 13:01:07,218 INFO core.QuartzScheduler :: Scheduler meta-data: Quartz Scheduler (v2.3.2) 'MetabaseScheduler' with instanceId 'BodosMac13-3.fritz.box1690974067204' Scheduler class: 'org.quartz.core.QuartzScheduler' - running locally. NOT STARTED. Currently in standby mode. Number of jobs executed: 0 Using thread pool 'org.quartz.simpl.SimpleThreadPool' - with 10 threads. Using job-store 'org.quartz.impl.jdbcjobstore.JobStoreTX' - which supports persistence. and is clustered. 2023-08-02 13:01:07,218 INFO impl.StdSchedulerFactory :: Quartz scheduler 'MetabaseScheduler' initialized from default resource file in Quartz package: 'quartz.properties' 2023-08-02 13:01:07,218 INFO impl.StdSchedulerFactory :: Quartz scheduler version: 2.3.2 2023-08-02 13:01:07,252 INFO core.QuartzScheduler :: Scheduler MetabaseScheduler_$_BodosMac13-3.fritz.box1690974067204 started. 2023-08-02 13:01:07,298 INFO core.QuartzScheduler :: Scheduler MetabaseScheduler_$_BodosMac13-3.fritz.box1690974067204 shutting down. 2023-08-02 13:01:07,299 INFO core.QuartzScheduler :: Scheduler MetabaseScheduler_$_BodosMac13-3.fritz.box1690974067204 paused. 2023-08-02 13:01:07,299 INFO core.QuartzScheduler :: Scheduler MetabaseScheduler_$_BodosMac13-3.fritz.box1690974067204 shutdown complete. 2023-08-02 13:01:07,380 INFO db.setup :: Database Migrations Current ... ✅ 2023-08-02 13:01:07,380 INFO metabase.util :: Database setup took 10,7 s 2023-08-02 13:01:07,381 INFO cmd.copy :: [OK] 2023-08-02 13:01:07,382 INFO cmd.copy :: Testing if target postgres database is already populated... 2023-08-02 13:01:07,385 INFO cmd.copy :: [OK] 2023-08-02 13:01:07,385 INFO cmd.copy :: Clearing default entries created by Liquibase migrations... 2023-08-02 13:01:07,387 INFO cmd.copy :: Temporarily disabling DB constraints... 2023-08-02 13:01:07,938 INFO cmd.copy :: [OK] 2023-08-02 13:01:09,614 INFO cmd.copy :: Re-enabling DB constraints... 
2023-08-02 13:01:09,614 INFO cmd.copy :: [OK] 2023-08-02 13:01:09,615 INFO cmd.copy :: [OK] 2023-08-02 13:01:09,618 INFO cmd.copy :: Temporarily disabling DB constraints... 2023-08-02 13:01:09,806 INFO cmd.copy :: [OK] 2023-08-02 13:01:09,813 INFO cmd.copy :: Copying instances of Database... 2023-08-02 13:01:09,822 INFO cmd.copy :: copied 1 instances. 2023-08-02 13:01:09,823 INFO cmd.copy :: Copying instances of User... 2023-08-02 13:01:09,827 INFO cmd.copy :: copied 1 instances. 2023-08-02 13:01:09,828 INFO cmd.copy :: Copying instances of Setting... 2023-08-02 13:01:09,834 INFO cmd.copy :: copied 12 instances. 2023-08-02 13:01:09,842 INFO cmd.copy :: Copying instances of Table... 2023-08-02 13:01:09,873 INFO cmd.copy :: copied 115 instances. 2023-08-02 13:01:09,923 INFO cmd.copy :: Copying instances of Field... 2023-08-02 13:01:10,371 INFO cmd.copy :: copied 2,521 instances. 2023-08-02 13:01:10,378 INFO cmd.copy :: Copying instances of FieldValues... 2023-08-02 13:01:10,389 INFO cmd.copy :: copied 24 instances. 2023-08-02 13:01:10,393 INFO cmd.copy :: Copying instances of Session... 2023-08-02 13:01:10,396 INFO cmd.copy :: copied 1 instances. 2023-08-02 13:01:10,398 INFO cmd.copy :: Copying instances of Collection... 2023-08-02 13:01:10,401 INFO cmd.copy :: copied 1 instances. 2023-08-02 13:01:10,406 INFO cmd.copy :: Copying instances of Activity... 2023-08-02 13:01:10,414 INFO cmd.copy :: copied 2 instances. 2023-08-02 13:01:10,417 INFO cmd.copy :: Copying instances of PermissionsGroup... 2023-08-02 13:01:10,423 INFO cmd.copy :: copied 2 instances. 2023-08-02 13:01:10,424 INFO cmd.copy :: Copying instances of PermissionsGroupMembership... 2023-08-02 13:01:10,430 INFO cmd.copy :: copied 2 instances. 2023-08-02 13:01:10,431 INFO cmd.copy :: Copying instances of Permissions... 2023-08-02 13:01:10,437 INFO cmd.copy :: copied 10 instances. 2023-08-02 13:01:10,441 INFO cmd.copy :: Copying instances of DataMigrations... 
2023-08-02 13:01:10,447 INFO cmd.copy :: copied 2 instances. 2023-08-02 13:01:10,448 INFO cmd.copy :: Re-enabling DB constraints... 2023-08-02 13:01:10,448 INFO cmd.copy :: [OK] org.postgresql.util.PSQLException: ERROR: insert or update on table "metabase_table" violates foreign key constraint "fk_table_ref_database_id" Detail: Key (db_id)=(1) is not present in table "metabase_database". at org.postgresql.core.v3.QueryExecutorImpl.receiveErrorResponse(QueryExecutorImpl.java:2676) at org.postgresql.core.v3.QueryExecutorImpl.processResults(QueryExecutorImpl.java:2366) at org.postgresql.core.v3.QueryExecutorImpl.execute(QueryExecutorImpl.java:356) at org.postgresql.core.v3.QueryExecutorImpl.execute(QueryExecutorImpl.java:316) at org.postgresql.jdbc.PgConnection.executeTransactionCommand(PgConnection.java:879) at org.postgresql.jdbc.PgConnection.commit(PgConnection.java:901) at com.mchange.v2.c3p0.impl.NewProxyConnection.commit(NewProxyConnection.java:981) at clojure.java.jdbc$db_transaction_STAR_.invokeStatic(jdbc.clj:815) at clojure.java.jdbc$db_transaction_STAR_.invoke(jdbc.clj:776) at clojure.java.jdbc$db_transaction_STAR_.invokeStatic(jdbc.clj:852) at clojure.java.jdbc$db_transaction_STAR_.invoke(jdbc.clj:776) at clojure.java.jdbc$db_transaction_STAR_.invokeStatic(jdbc.clj:789) at clojure.java.jdbc$db_transaction_STAR_.invoke(jdbc.clj:776) at metabase.cmd.copy$fn__102278$copy_BANG___102283$fn__102284.invoke(copy.clj:391) at metabase.cmd.copy$fn__102278$copy_BANG___102283.invoke(copy.clj:368) at metabase.cmd.load_from_h2$load_from_h2_BANG_.invokeStatic(load_from_h2.clj:36) at metabase.cmd.load_from_h2$load_from_h2_BANG_.invoke(load_from_h2.clj:26) at clojure.lang.Var.invoke(Var.java:384) at metabase.cmd$load_from_h2.invokeStatic(cmd.clj:48) at metabase.cmd$load_from_h2.invoke(cmd.clj:42) at clojure.lang.AFn.applyToHelper(AFn.java:154) at clojure.lang.AFn.applyTo(AFn.java:144) at clojure.core$apply.invokeStatic(core.clj:667) at 
clojure.core$apply.invoke(core.clj:662) at metabase.cmd$run_cmd$fn__103139.invoke(cmd.clj:264) at metabase.cmd$run_cmd.invokeStatic(cmd.clj:264) at metabase.cmd$run_cmd.invoke(cmd.clj:255) at clojure.lang.Var.invoke(Var.java:388) at metabase.core$run_cmd.invokeStatic(core.clj:166) at metabase.core$run_cmd.invoke(core.clj:164) at metabase.core$_main.invokeStatic(core.clj:188) at metabase.core$_main.doInvoke(core.clj:183) at clojure.lang.RestFn.applyTo(RestFn.java:137) at clojure.lang.Var.applyTo(Var.java:705) at clojure.core$apply.invokeStatic(core.clj:667) at clojure.core$apply.invoke(core.clj:662) at metabase.bootstrap$_main.invokeStatic(bootstrap.clj:31) at metabase.bootstrap$_main.doInvoke(bootstrap.clj:28) at clojure.lang.RestFn.applyTo(RestFn.java:137) at metabase.bootstrap.main(Unknown Source) Command failed with exception: ERROR: insert or update on table "metabase_table" violates foreign key constraint "fk_table_ref_database_id" Detail: Key (db_id)=(1) is not present in table "metabase_database". ``` ### Information about your Metabase installation ```JSON After migration I can no longer open Admin->Troubleshooting Metabase v0.46.6.4 (7c60aca release-x.46.6.x) macOS ``` ### Severity it is blocking my usage entirely ### Additional context _No response_
process
migration from to postgres fails with command failed with exception error insert or update on table metabase table violates foreign key constraint fk table ref database id describe the bug try to migrate meta database from to production database export mb db type postgres export mb db connection uri jdbc postgresql localhost metabase user postgres password mysecret java jar metabase jar load from metabase db get the following error rg postgresql util psqlexception error insert or update on table metabase table violates foreign key constraint fk table ref database id detail key db id is not present in table metabase database at org postgresql core queryexecutorimpl receiveerrorresponse queryexecutorimpl java at org postgresql core queryexecutorimpl processresults queryexecutorimpl java at org postgresql core queryexecutorimpl execute queryexecutorimpl java at org postgresql core queryexecutorimpl execute queryexecutorimpl java at org postgresql jdbc pgconnection executetransactioncommand pgconnection java at org postgresql jdbc pgconnection commit pgconnection java at com mchange impl newproxyconnection commit newproxyconnection java at clojure java jdbc db transaction star invokestatic jdbc clj at clojure java jdbc db transaction star invoke jdbc clj at clojure java jdbc db transaction star invokestatic jdbc clj at clojure java jdbc db transaction star invoke jdbc clj at clojure java jdbc db transaction star invokestatic jdbc clj at clojure java jdbc db transaction star invoke jdbc clj at metabase cmd copy fn copy bang fn invoke copy clj at metabase cmd copy fn copy bang invoke copy clj at metabase cmd load from load from bang invokestatic load from clj at metabase cmd load from load from bang invoke load from clj at clojure lang var invoke var java at metabase cmd load from invokestatic cmd clj at metabase cmd load from invoke cmd clj at clojure lang afn applytohelper afn java at clojure lang afn applyto afn java at clojure core apply invokestatic core clj at 
clojure core apply invoke core clj at metabase cmd run cmd fn invoke cmd clj at metabase cmd run cmd invokestatic cmd clj at metabase cmd run cmd invoke cmd clj at clojure lang var invoke var java at metabase core run cmd invokestatic core clj at metabase core run cmd invoke core clj at metabase core main invokestatic core clj at metabase core main doinvoke core clj at clojure lang restfn applyto restfn java at clojure lang var applyto var java at clojure core apply invokestatic core clj at clojure core apply invoke core clj at metabase bootstrap main invokestatic bootstrap clj at metabase bootstrap main doinvoke bootstrap clj at clojure lang restfn applyto restfn java at metabase bootstrap main unknown source note that this is a clean install i have only downloaded the jar file and started it once with connection to a postgres database so the postgres schema has been sync ed once then i shutdown the metabase instance and immediately started the migration to reproduce download metabase jar open web front end in browser follow setup steps and connect to one single postgres database wait until the postgres schema is sync ed shutdown metabase start migration as described above expected behavior i expect the migration to finish without error logs i can no longer start metabase after the failed migration however i can provide my terminal output from the failed migration java jar metabase jar load from metabase db info metabase util maximum memory available to jvm gb info util encryption saved credentials encryption is disabled for this metabase instance 🔓 for more information see instrumented metabase util malli with api error message instrumented metabase util honey sql identifier instrumented metabase util honey sql normalize type info instrumented metabase util honey sql with database type info instrumented metabase util honey sql cast instrumented metabase util honey sql quoted cast instrumented metabase util honey sql maybe cast instrumented metabase models 
permissions classify path instrumented metabase models permissions classify data path instrumented metabase models permissions generate graph instrumented metabase models permissions path instrumented metabase models permissions update db data access permissions instrumented metabase models permissions update group permissions instrumented metabase models permissions update data perms graph instrumented metabase models parameter card upsert or delete from parameters warning abs already refers to clojure core abs in namespace kixi stats math being replaced by kixi stats math abs warning abs already refers to clojure core abs in namespace kixi stats test being replaced by kixi stats math abs warning abs already refers to clojure core abs in namespace kixi stats distribution being replaced by kixi stats math abs instrumented metabase query processor middleware permissions check query permissions instrumented metabase driver sql jdbc sync describe database simple select probe query info driver impl registered abstract driver sql 🚚 info metabase util ⮦ load driver sql took ms info driver impl registered abstract driver sql jdbc parents 🚚 info metabase util load driver sql jdbc took ms info driver impl registered driver parents 🚚 instrumented metabase driver classify query info driver impl registered driver mysql parents 🚚 info driver impl registered driver postgres parents 🚚 instrumented metabase models params custom values values from card instrumented metabase api card param values instrumented metabase api dashboard chain filter instrumented metabase api setup state for checklist instrumented metabase api setup checklist items info metabase core metabase release x x copyright © metabase inc metabase enterprise edition extensions are not present info cmd copy set up source database and run migrations info db setup verifying database connection info db setup successfully verified application database connection ✅ info db setup running database migrations info db setup 
setting up liquibase info db setup liquibase is ready info db liquibase checking if database has unrun migrations info db setup database migrations current ✅ info db data migrations running all necessary data migrations this may take a minute info db data migrations finished running data migrations info metabase util database setup took s info cmd copy info cmd copy set up postgres target database and run migrations info db setup verifying postgres database connection info db setup successfully verified postgresql debian application database connection ✅ info db setup running database migrations info db setup setting up liquibase info db setup liquibase is ready info db liquibase checking if database has unrun migrations info db liquibase database has unrun migrations waiting for migration lock to be cleared info db liquibase migration lock is cleared running migrations info impl stdschedulerfactory using default implementation for threadexecutor info core schedulersignalerimpl initialized scheduler signaller of type class org quartz core schedulersignalerimpl info core quartzscheduler quartz scheduler v created info jdbcjobstore jobstoretx using db table based data access locking synchronization info jdbcjobstore jobstoretx jobstoretx initialized info core quartzscheduler scheduler meta data quartz scheduler metabasescheduler with instanceid fritz scheduler class org quartz core quartzscheduler running locally not started currently in standby mode number of jobs executed using thread pool org quartz simpl simplethreadpool with threads using job store org quartz impl jdbcjobstore jobstoretx which supports persistence and is clustered info impl stdschedulerfactory quartz scheduler metabasescheduler initialized from default resource file in quartz package quartz properties info impl stdschedulerfactory quartz scheduler version info core quartzscheduler scheduler metabasescheduler fritz started info core quartzscheduler scheduler metabasescheduler fritz shutting down 
info core quartzscheduler scheduler metabasescheduler fritz paused info core quartzscheduler scheduler metabasescheduler fritz shutdown complete info db setup database migrations current ✅ info metabase util database setup took s info cmd copy info cmd copy testing if target postgres database is already populated info cmd copy info cmd copy clearing default entries created by liquibase migrations info cmd copy temporarily disabling db constraints info cmd copy info cmd copy re enabling db constraints info cmd copy info cmd copy info cmd copy temporarily disabling db constraints info cmd copy info cmd copy copying instances of database info cmd copy copied instances info cmd copy copying instances of user info cmd copy copied instances info cmd copy copying instances of setting info cmd copy copied instances info cmd copy copying instances of table info cmd copy copied instances info cmd copy copying instances of field info cmd copy copied instances info cmd copy copying instances of fieldvalues info cmd copy copied instances info cmd copy copying instances of session info cmd copy copied instances info cmd copy copying instances of collection info cmd copy copied instances info cmd copy copying instances of activity info cmd copy copied instances info cmd copy copying instances of permissionsgroup info cmd copy copied instances info cmd copy copying instances of permissionsgroupmembership info cmd copy copied instances info cmd copy copying instances of permissions info cmd copy copied instances info cmd copy copying instances of datamigrations info cmd copy copied instances info cmd copy re enabling db constraints info cmd copy org postgresql util psqlexception error insert or update on table metabase table violates foreign key constraint fk table ref database id detail key db id is not present in table metabase database at org postgresql core queryexecutorimpl receiveerrorresponse queryexecutorimpl java at org postgresql core queryexecutorimpl processresults 
queryexecutorimpl java at org postgresql core queryexecutorimpl execute queryexecutorimpl java at org postgresql core queryexecutorimpl execute queryexecutorimpl java at org postgresql jdbc pgconnection executetransactioncommand pgconnection java at org postgresql jdbc pgconnection commit pgconnection java at com mchange impl newproxyconnection commit newproxyconnection java at clojure java jdbc db transaction star invokestatic jdbc clj at clojure java jdbc db transaction star invoke jdbc clj at clojure java jdbc db transaction star invokestatic jdbc clj at clojure java jdbc db transaction star invoke jdbc clj at clojure java jdbc db transaction star invokestatic jdbc clj at clojure java jdbc db transaction star invoke jdbc clj at metabase cmd copy fn copy bang fn invoke copy clj at metabase cmd copy fn copy bang invoke copy clj at metabase cmd load from load from bang invokestatic load from clj at metabase cmd load from load from bang invoke load from clj at clojure lang var invoke var java at metabase cmd load from invokestatic cmd clj at metabase cmd load from invoke cmd clj at clojure lang afn applytohelper afn java at clojure lang afn applyto afn java at clojure core apply invokestatic core clj at clojure core apply invoke core clj at metabase cmd run cmd fn invoke cmd clj at metabase cmd run cmd invokestatic cmd clj at metabase cmd run cmd invoke cmd clj at clojure lang var invoke var java at metabase core run cmd invokestatic core clj at metabase core run cmd invoke core clj at metabase core main invokestatic core clj at metabase core main doinvoke core clj at clojure lang restfn applyto restfn java at clojure lang var applyto var java at clojure core apply invokestatic core clj at clojure core apply invoke core clj at metabase bootstrap main invokestatic bootstrap clj at metabase bootstrap main doinvoke bootstrap clj at clojure lang restfn applyto restfn java at metabase bootstrap main unknown source command failed with exception error insert or update on 
table metabase table violates foreign key constraint fk table ref database id detail key db id is not present in table metabase database information about your metabase installation json after migration i can no longer open admin troubleshooting metabase release x x macos severity it is blocking my usage entirely additional context no response
1
30,944
25,190,569,440
IssuesEvent
2022-11-12 00:01:39
microsoft/TypeScript
https://api.github.com/repos/microsoft/TypeScript
closed
To add `cache` support for GitHub Actions using `setup-node`
Infrastructure
# Suggestion To add `cache` support for GitHub Actions using `setup-node` ## 🔍 Search Terms <!-- 💡 Did you know? TypeScript has over 2,000 open suggestions! 🔎 Please search thoroughly before logging new feature requests as most common ideas already have a proposal in progress. The "Common Feature Requests" section of the FAQ lists many popular requests: https://github.com/Microsoft/TypeScript/wiki/FAQ#common-feature-requests Replace the text below: --> List of keywords you searched for before creating this issue. Write them down here so that others can find this suggestion more easily and help provide feedback. `cache`, `setup-node` ## ✅ Viability Checklist <!-- Suggestions that don't meet all these criteria are very, very unlikely to be accepted. We always recommend reviewing the TypeScript design goals before investing time writing a proposal for ideas outside the scope of the project. --> My suggestion meets these guidelines: * [x] This wouldn't be a breaking change in existing TypeScript/JavaScript code * [x] This wouldn't change the runtime behavior of existing JavaScript code * [x] This could be implemented without emitting different JS based on the types of the expressions * [x] This isn't a runtime feature (e.g. library functionality, non-ECMAScript syntax with JavaScript output, new syntax sugar for JS, etc.) * [x] This feature would agree with the rest of [TypeScript's Design Goals](https://github.com/Microsoft/TypeScript/wiki/TypeScript-Design-Goals). ## ⭐ Suggestion <!-- A summary of what you'd like to see added or changed --> ## 📃 Motivating Example `setup-node` GitHub Action just released a new option to add cache to steps using it. You can find the details here: https://github.blog/changelog/2021-07-02-github-actions-setup-node-now-supports-dependency-caching/ ## 💻 Use Cases See https://github.com/microsoft/TypeScript/pull/44897
1.0
To add `cache` support for GitHub Actions using `setup-node` - # Suggestion To add `cache` support for GitHub Actions using `setup-node` ## 🔍 Search Terms <!-- 💡 Did you know? TypeScript has over 2,000 open suggestions! 🔎 Please search thoroughly before logging new feature requests as most common ideas already have a proposal in progress. The "Common Feature Requests" section of the FAQ lists many popular requests: https://github.com/Microsoft/TypeScript/wiki/FAQ#common-feature-requests Replace the text below: --> List of keywords you searched for before creating this issue. Write them down here so that others can find this suggestion more easily and help provide feedback. `cache`, `setup-node` ## ✅ Viability Checklist <!-- Suggestions that don't meet all these criteria are very, very unlikely to be accepted. We always recommend reviewing the TypeScript design goals before investing time writing a proposal for ideas outside the scope of the project. --> My suggestion meets these guidelines: * [x] This wouldn't be a breaking change in existing TypeScript/JavaScript code * [x] This wouldn't change the runtime behavior of existing JavaScript code * [x] This could be implemented without emitting different JS based on the types of the expressions * [x] This isn't a runtime feature (e.g. library functionality, non-ECMAScript syntax with JavaScript output, new syntax sugar for JS, etc.) * [x] This feature would agree with the rest of [TypeScript's Design Goals](https://github.com/Microsoft/TypeScript/wiki/TypeScript-Design-Goals). ## ⭐ Suggestion <!-- A summary of what you'd like to see added or changed --> ## 📃 Motivating Example `setup-node` GitHub Action just released a new option to add cache to steps using it. You can find the details here: https://github.blog/changelog/2021-07-02-github-actions-setup-node-now-supports-dependency-caching/ ## 💻 Use Cases See https://github.com/microsoft/TypeScript/pull/44897
non_process
to add cache support for github actions using setup node suggestion to add cache support for github actions using setup node 🔍 search terms 💡 did you know typescript has over open suggestions 🔎 please search thoroughly before logging new feature requests as most common ideas already have a proposal in progress the common feature requests section of the faq lists many popular requests replace the text below list of keywords you searched for before creating this issue write them down here so that others can find this suggestion more easily and help provide feedback cache setup node ✅ viability checklist suggestions that don t meet all these criteria are very very unlikely to be accepted we always recommend reviewing the typescript design goals before investing time writing a proposal for ideas outside the scope of the project my suggestion meets these guidelines this wouldn t be a breaking change in existing typescript javascript code this wouldn t change the runtime behavior of existing javascript code this could be implemented without emitting different js based on the types of the expressions this isn t a runtime feature e g library functionality non ecmascript syntax with javascript output new syntax sugar for js etc this feature would agree with the rest of ⭐ suggestion 📃 motivating example setup node github action just released a new option to add cache to steps using it you can find the details here 💻 use cases see
0
278,811
24,179,092,604
IssuesEvent
2022-09-23 07:03:45
elastic/apm-server
https://api.github.com/repos/elastic/apm-server
opened
Manual Test Plan 8.5 release
test-plan
When picking up a test case, please add your name to this overview beforehand and tick the checkbox when finished. Testing can be started when the first build candidate (BC) is available. ## Smoke Testing ESS setup * [x] enable metrics collection in the Cloud UI and check APM Server data show up in the Stack Monitoring UI in Kibana (known bug: https://github.com/elastic/apm-server/issues/8383) * [x] The cloud troubleshooting logs and metrics are accessible as expected. Thanks to https://github.com/elastic/apm-server/issues/8303 further smoke tests are run automatically on ESS now. ## Test cases from the github board [apm-server 8.5 test-plan](https://github.com/elastic/apm-server/issues?q=label%3Atest-plan+label%3Av8.5.0) Add yourself as _assignee_ on the PR before you start testing. ## Regressions Link any regressions to this issue.
1.0
Manual Test Plan 8.5 release - When picking up a test case, please add your name to this overview beforehand and tick the checkbox when finished. Testing can be started when the first build candidate (BC) is available. ## Smoke Testing ESS setup * [x] enable metrics collection in the Cloud UI and check APM Server data show up in the Stack Monitoring UI in Kibana (known bug: https://github.com/elastic/apm-server/issues/8383) * [x] The cloud troubleshooting logs and metrics are accessible as expected. Thanks to https://github.com/elastic/apm-server/issues/8303 further smoke tests are run automatically on ESS now. ## Test cases from the github board [apm-server 8.5 test-plan](https://github.com/elastic/apm-server/issues?q=label%3Atest-plan+label%3Av8.5.0) Add yourself as _assignee_ on the PR before you start testing. ## Regressions Link any regressions to this issue.
non_process
manual test plan release when picking up a test case please add your name to this overview beforehand and tick the checkbox when finished testing can be started when the first build candidate bc is available smoke testing ess setup enable metrics collection in the cloud ui and check apm server data show up in the stack monitoring ui in kibana known bug the cloud troubleshooting logs and metrics are accessible as expected thanks to further smoke tests are run automatically on ess now test cases from the github board add yourself as assignee on the pr before you start testing regressions link any regressions to this issue
0
239,968
18,289,288,977
IssuesEvent
2021-10-05 13:42:01
GowthamGoush/Amazing_Sites
https://api.github.com/repos/GowthamGoush/Amazing_Sites
closed
freecodecamp.org
documentation hacktoberfest
Free Code Camp is a great place to start learning new technologies. It is free and open-source and is accessible to anyone. Their Youtube channel has content in all fields and is helpful to all new and experienced developers. Overall, I think it's a site to be featured in this repo.
1.0
freecodecamp.org - Free Code Camp is a great place to start learning new technologies. It is free and open-source and is accessible to anyone. Their Youtube channel has content in all fields and is helpful to all new and experienced developers. Overall, I think it's a site to be featured in this repo.
non_process
freecodecamp org free code camp is a great place to start learning new technologies it is free and open source and is accessible to anyone their youtube channel has content in all fields and is helpful to all new and experienced developers overall i think it s a site to be featured in this repo
0
62,882
26,196,075,473
IssuesEvent
2023-01-03 13:34:56
cityofaustin/atd-data-tech
https://api.github.com/repos/cityofaustin/atd-data-tech
opened
Fix the White Screen of Death (WSOD) for the VZV in 2024
Type: Bug Report Impact: 2-Major Service: Dev Need: 1-Must Have Workgroup: VZ Product: Vision Zero Viewer
Users may experience the White Screen of Death (WSOD) on the Vision Zero Viewer at the beginning of the year. The WSOD was caused because it was missing a [hard-coded population estimate](https://github.com/cityofaustin/atd-vz-data/blob/master/atd-vzv/src/constants/popEsts.js) for the City from the previous year. Update the `popEsts` directly after the new year to prevent the WSOD from persisting. Reference the [PR from last year.](https://github.com/cityofaustin/atd-vz-data/pull/1148) h/t @frankhereford
1.0
Fix the White Screen of Death (WSOD) for the VZV in 2024 - Users may experience the White Screen of Death (WSOD) on the Vision Zero Viewer at the beginning of the year. The WSOD was caused because it was missing a [hard-coded population estimate](https://github.com/cityofaustin/atd-vz-data/blob/master/atd-vzv/src/constants/popEsts.js) for the City from the previous year. Update the `popEsts` directly after the new year to prevent the WSOD from persisting. Reference the [PR from last year.](https://github.com/cityofaustin/atd-vz-data/pull/1148) h/t @frankhereford
non_process
fix the white screen of death wsod for the vzv in users may experience the white screen of death wsod on the vision zero viewer at the beginning of the year the wsod was caused because it was missing a for the city from the previous year update the popests directly after the new year to prevent the wsod from persisting reference the h t frankhereford
0
16,005
20,188,209,703
IssuesEvent
2022-02-11 01:18:14
savitamittalmsft/WAS-SEC-TEST
https://api.github.com/repos/savitamittalmsft/WAS-SEC-TEST
opened
Mitigate DDoS attacks
WARP-Import WAF FEB 2021 Security Performance and Scalability Capacity Management Processes Networking & Connectivity Endpoints
<a href="https://docs.microsoft.com/azure/architecture/framework/security/design-network-endpoints#mitigate-ddos-attacks">Mitigate DDoS attacks</a> <p><b>Why Consider This?</b></p> DDoS attacks can be debilitating and completely block access to, or take down, your services. The worst time to plan a DDoS strategy is while under DDoS attack. <p><b>Context</b></p> <p><span>The major cloud service providers offer DDoS protection of services of varying effectiveness and capacity. The cloud service providers typically provide two DDoS protection options:</span></p><ul style="list-style-type:disc"><li value="1" style="text-indent: 0px;"><span>DDoS protection at the cloud network fabric level - all customers of the cloud service provider benefit from these protections. The protection is usually focused at the network (layer 3) level.</span></li><li value="2" style="margin-right: 0px;text-indent: 0px;"><span>DDoS protection at higher levels that profile your services - this kind of protection will baseline your deployments and then use machine learning techniques to detect anomalous traffic and proactively protect based on their protection before there is service degradation</span></li></ul><p><span>It's recommended to adopt advanced protection for any services where downtime will have negative impact on the business.</span></p> <p><b>Suggested Actions</b></p> <p><span>Identify critical workloads that are susceptible to DDoS attacks and enable Distributed Denial of Service (DDoS) mitigations for all business-critical web applications and services.</span></p> <p><b>Learn More</b></p> <p><a href="https://docs.microsoft.com/en-us/azure/architecture/framework/Security/network-security-containment#mitigate-ddos-attacks" target="_blank"><span>Mitigate DDoS attacks</span></a><span /></p>
1.0
Mitigate DDoS attacks - <a href="https://docs.microsoft.com/azure/architecture/framework/security/design-network-endpoints#mitigate-ddos-attacks">Mitigate DDoS attacks</a> <p><b>Why Consider This?</b></p> DDoS attacks can be debilitating and completely block access to, or take down, your services. The worst time to plan a DDoS strategy is while under DDoS attack. <p><b>Context</b></p> <p><span>The major cloud service providers offer DDoS protection of services of varying effectiveness and capacity. The cloud service providers typically provide two DDoS protection options:</span></p><ul style="list-style-type:disc"><li value="1" style="text-indent: 0px;"><span>DDoS protection at the cloud network fabric level - all customers of the cloud service provider benefit from these protections. The protection is usually focused at the network (layer 3) level.</span></li><li value="2" style="margin-right: 0px;text-indent: 0px;"><span>DDoS protection at higher levels that profile your services - this kind of protection will baseline your deployments and then use machine learning techniques to detect anomalous traffic and proactively protect based on their protection before there is service degradation</span></li></ul><p><span>It's recommended to adopt advanced protection for any services where downtime will have negative impact on the business.</span></p> <p><b>Suggested Actions</b></p> <p><span>Identify critical workloads that are susceptible to DDoS attacks and enable Distributed Denial of Service (DDoS) mitigations for all business-critical web applications and services.</span></p> <p><b>Learn More</b></p> <p><a href="https://docs.microsoft.com/en-us/azure/architecture/framework/Security/network-security-containment#mitigate-ddos-attacks" target="_blank"><span>Mitigate DDoS attacks</span></a><span /></p>
process
mitigate ddos attacks why consider this ddos attacks can be debilitating and completely block access to or take down your services the worst time to plan a ddos strategy is while under ddos attack context the major cloud service providers offer ddos protection of services of varying effectiveness and capacity the cloud service providers typically provide two ddos protection options ddos protection at the cloud network fabric level all customers of the cloud service provider benefit from these protections the protection is usually focused at the network layer level ddos protection at higher levels that profile your services this kind of protection will baseline your deployments and then use machine learning techniques to detect anomalous traffic and proactively protect based on their protection before there is service degradation it s recommended to adopt advanced protection for any services where downtime will have negative impact on the business suggested actions identify critical workloads that are susceptible to ddos attacks and enable distributed denial of service ddos mitigations for all business critical web applications and services learn more mitigate ddos attacks
1
952
3,418,193,610
IssuesEvent
2015-12-08 00:28:12
martensonbj/traffic-spy-skeleton
https://api.github.com/repos/martensonbj/traffic-spy-skeleton
opened
processing_requests_400
processing requests user story
As a user When I send a POST request to 'http://yourapplication:port/sources/IDENTIFIER/data' And I do not send a payload Then I get a response of 'Missing Payload - 400 Bad Request'
1.0
processing_requests_400 - As a user When I send a POST request to 'http://yourapplication:port/sources/IDENTIFIER/data' And I do not send a payload Then I get a response of 'Missing Payload - 400 Bad Request'
process
processing requests as a user when i send a post request to and i do not send a payload then i get a response of missing payload bad request
1
206,706
7,120,449,087
IssuesEvent
2018-01-19 01:16:37
neuropoly/spinalcordtoolbox
https://api.github.com/repos/neuropoly/spinalcordtoolbox
closed
Verbose issue if no output name is specified
bug priority:MEDIUM sct_image
### Description When using `-setorient` or `-setorient-data`, if user does not specify output name, it will say: "created file: None" without error message, although a file is created with suffix "_ORIENTATION". ### Additional Information ~~~ sct_image -i t2_copy.nii.gz -setorient RIP -- Spinal Cord Toolbox (qt-propseg/5bcbd705c2dd5041a529cbdd45887ec81ff63583) Running /Users/julien/code/sct/scripts/sct_image.py -i t2_copy.nii.gz -setorient RIP t2_copy.nii.gz Get dimensions of data... 64 x 320 x 320 x 1 Change orientation... Generate output files... WARNING: File t2_copy_RIP.nii.gz already exists. Deleting it. Finished. To view results, type: fslview None & ~~~
1.0
Verbose issue if no output name is specified - ### Description When using `-setorient` or `-setorient-data`, if user does not specify output name, it will say: "created file: None" without error message, although a file is created with suffix "_ORIENTATION". ### Additional Information ~~~ sct_image -i t2_copy.nii.gz -setorient RIP -- Spinal Cord Toolbox (qt-propseg/5bcbd705c2dd5041a529cbdd45887ec81ff63583) Running /Users/julien/code/sct/scripts/sct_image.py -i t2_copy.nii.gz -setorient RIP t2_copy.nii.gz Get dimensions of data... 64 x 320 x 320 x 1 Change orientation... Generate output files... WARNING: File t2_copy_RIP.nii.gz already exists. Deleting it. Finished. To view results, type: fslview None & ~~~
non_process
verbose issue if no output name is specified description when using setorient or setorient data if user does not specify output name it will say created file none without error message although a file is created with suffix orientation additional information sct image i copy nii gz setorient rip spinal cord toolbox qt propseg running users julien code sct scripts sct image py i copy nii gz setorient rip copy nii gz get dimensions of data x x x change orientation generate output files warning file copy rip nii gz already exists deleting it finished to view results type fslview none
0
4,522
7,370,551,246
IssuesEvent
2018-03-13 08:53:12
DevExpress/testcafe-hammerhead
https://api.github.com/repos/DevExpress/testcafe-hammerhead
closed
Regular expression on destination site works wrong due to our code instrumentation
AREA: client AREA: server SYSTEM: resource processing TYPE: bug health-monitor
**Site**: http://tfile.me/. Code without hh: ```js function fnName() { let rand = "aGVsbG8="; bi.src = img; } var str = fnName.toString().match(/let\s*rand\s*\=\s*\"(.*)\"/i)[1]; // str = "aGVsbG8=" console.log(atob(str)); // "hello" ``` With hh: ```js function fnName() { let rand = "aGVsbG8="; __set$(bi,"src",img); } var str = fnName.toString().match(/let\s*rand\s*\=\s*\"(.*)\"/i)[1]; // str = "aGVsbG8="; __set$(bi,"src" console.log(atob(str)); // Uncaught DOMException: Failed to execute 'atob' on 'Window': The string to be decoded is not correctly encoded. ```
1.0
Regular expression on destination site works wrong due to our code instrumentation - **Site**: http://tfile.me/. Code without hh: ```js function fnName() { let rand = "aGVsbG8="; bi.src = img; } var str = fnName.toString().match(/let\s*rand\s*\=\s*\"(.*)\"/i)[1]; // str = "aGVsbG8=" console.log(atob(str)); // "hello" ``` With hh: ```js function fnName() { let rand = "aGVsbG8="; __set$(bi,"src",img); } var str = fnName.toString().match(/let\s*rand\s*\=\s*\"(.*)\"/i)[1]; // str = "aGVsbG8="; __set$(bi,"src" console.log(atob(str)); // Uncaught DOMException: Failed to execute 'atob' on 'Window': The string to be decoded is not correctly encoded. ```
process
regular expression on destination site works wrong due to our code instrumentation site code without hh js function fnname let rand bi src img var str fnname tostring match let s rand s s i str console log atob str hello with hh js function fnname let rand set bi src img var str fnname tostring match let s rand s s i str set bi src console log atob str uncaught domexception failed to execute atob on window the string to be decoded is not correctly encoded
1
15,298
19,318,766,558
IssuesEvent
2021-12-14 01:22:26
microsoft/vscode
https://api.github.com/repos/microsoft/vscode
closed
Tasks with type `process` and a missing `command` fail too early to emit the right events
bug tasks terminal-process
Issue Type: <b>Bug</b> See the [reproduction repo](https://github.com/mkhl/issue-process-task-for-missing-executable-drops-events), which provides two tasks `shell` and `process` and a README with these steps. ### Actual behavior 1. Run the task `process` 1. Run the task `process` again. 1. Note that each failed in its own terminal, despite what the message in the terminal said. 1. Focus the terminal and press a key to close it. 1. Run the task `shell`. 1. Run the task `shell` again. 1. Note that both failed in the same terminal, again not shared with the `process` task. 1. Without closing the terminal, run the task `process` again. 1. Note that the task reuses the terminal now, but doesn't fail. Instead it just produces no output at all, and doesn't seem to signal that it's finished. ### Expected behavior The `process` task doesn't hang, it just fails. Its terminal gets reused like the one from `shell`. Ideally both tasks would emit the exact same sequence events, including both `onDidEndTask` and `onDidEndTaskProcess` so the listener can determine that the task failed. 
VS Code version: Code - Insiders 1.63.0-insider (c42793d0357ff9c6589cce79a847177fd42852ee, 2021-11-29T08:08:44.056Z) OS version: Linux x64 5.15.4-201.fc35.x86_64 Restricted Mode: No <details> <summary>System Info</summary> |Item|Value| |---|---| |CPUs|Intel(R) Core(TM) i7-8550U CPU @ 1.80GHz (8 x 1567)| |GPU Status|2d_canvas: enabled<br>gpu_compositing: enabled<br>multiple_raster_threads: enabled_on<br>oop_rasterization: disabled_off<br>opengl: enabled_on<br>rasterization: disabled_software<br>skia_renderer: enabled_on<br>video_decode: disabled_software<br>vulkan: disabled_off<br>webgl: enabled<br>webgl2: enabled| |Load (avg)|1, 1, 1| |Memory (System)|23.24GB (2.03GB free)| |Process Argv|--disable-extensions .| |Screen Reader|no| |VM|0%| |DESKTOP_SESSION|gnome| |XDG_CURRENT_DESKTOP|GNOME| |XDG_SESSION_DESKTOP|gnome| |XDG_SESSION_TYPE|wayland| </details>Extensions disabled<details> <summary>A/B Experiments</summary> ``` vsliv695:30137379 vsins829:30139715 vsliv368cf:30146710 vsreu685:30147344 python383:30185418 vspor879:30202332 vspor708:30202333 vspor363:30204092 pythontb:30258533 pythonptprofiler:30281269 vshan820:30294714 pythondataviewer:30285072 vscod805:30301674 pythonvspyt200:30323110 bridge0708:30335490 bridge0723:30353136 pythonrunftest32:30365365 pythonf5test824:30361779 javagetstartedt:30350119 pythonvspyt187:30365360 vsaa593:30376534 vsc1dst:30396469 pythonvs932:30404738 vscexrecpromptt2:30397559 vscop804cf:30404767 vs360:30404995 ``` </details> <!-- generated by issue reporter -->
1.0
Tasks with type `process` and a missing `command` fail too early to emit the right events - Issue Type: <b>Bug</b> See the [reproduction repo](https://github.com/mkhl/issue-process-task-for-missing-executable-drops-events), which provides two tasks `shell` and `process` and a README with these steps. ### Actual behavior 1. Run the task `process` 1. Run the task `process` again. 1. Note that each failed in its own terminal, despite what the message in the terminal said. 1. Focus the terminal and press a key to close it. 1. Run the task `shell`. 1. Run the task `shell` again. 1. Note that both failed in the same terminal, again not shared with the `process` task. 1. Without closing the terminal, run the task `process` again. 1. Note that the task reuses the terminal now, but doesn't fail. Instead it just produces no output at all, and doesn't seem to signal that it's finished. ### Expected behavior The `process` task doesn't hang, it just fails. Its terminal gets reused like the one from `shell`. Ideally both tasks would emit the exact same sequence events, including both `onDidEndTask` and `onDidEndTaskProcess` so the listener can determine that the task failed. 
VS Code version: Code - Insiders 1.63.0-insider (c42793d0357ff9c6589cce79a847177fd42852ee, 2021-11-29T08:08:44.056Z) OS version: Linux x64 5.15.4-201.fc35.x86_64 Restricted Mode: No <details> <summary>System Info</summary> |Item|Value| |---|---| |CPUs|Intel(R) Core(TM) i7-8550U CPU @ 1.80GHz (8 x 1567)| |GPU Status|2d_canvas: enabled<br>gpu_compositing: enabled<br>multiple_raster_threads: enabled_on<br>oop_rasterization: disabled_off<br>opengl: enabled_on<br>rasterization: disabled_software<br>skia_renderer: enabled_on<br>video_decode: disabled_software<br>vulkan: disabled_off<br>webgl: enabled<br>webgl2: enabled| |Load (avg)|1, 1, 1| |Memory (System)|23.24GB (2.03GB free)| |Process Argv|--disable-extensions .| |Screen Reader|no| |VM|0%| |DESKTOP_SESSION|gnome| |XDG_CURRENT_DESKTOP|GNOME| |XDG_SESSION_DESKTOP|gnome| |XDG_SESSION_TYPE|wayland| </details>Extensions disabled<details> <summary>A/B Experiments</summary> ``` vsliv695:30137379 vsins829:30139715 vsliv368cf:30146710 vsreu685:30147344 python383:30185418 vspor879:30202332 vspor708:30202333 vspor363:30204092 pythontb:30258533 pythonptprofiler:30281269 vshan820:30294714 pythondataviewer:30285072 vscod805:30301674 pythonvspyt200:30323110 bridge0708:30335490 bridge0723:30353136 pythonrunftest32:30365365 pythonf5test824:30361779 javagetstartedt:30350119 pythonvspyt187:30365360 vsaa593:30376534 vsc1dst:30396469 pythonvs932:30404738 vscexrecpromptt2:30397559 vscop804cf:30404767 vs360:30404995 ``` </details> <!-- generated by issue reporter -->
process
tasks with type process and a missing command fail too early to emit the right events issue type bug see the which provides two tasks shell and process and a readme with these steps actual behavior run the task process run the task process again note that each failed in its own terminal despite what the message in the terminal said focus the terminal and press a key to close it run the task shell run the task shell again note that both failed in the same terminal again not shared with the process task without closing the terminal run the task process again note that the task reuses the terminal now but doesn t fail instead it just produces no output at all and doesn t seem to signal that it s finished expected behavior the process task doesn t hang it just fails its terminal gets reused like the one from shell ideally both tasks would emit the exact same sequence events including both ondidendtask and ondidendtaskprocess so the listener can determine that the task failed vs code version code insiders insider os version linux restricted mode no system info item value cpus intel r core tm cpu x gpu status canvas enabled gpu compositing enabled multiple raster threads enabled on oop rasterization disabled off opengl enabled on rasterization disabled software skia renderer enabled on video decode disabled software vulkan disabled off webgl enabled enabled load avg memory system free process argv disable extensions screen reader no vm desktop session gnome xdg current desktop gnome xdg session desktop gnome xdg session type wayland extensions disabled a b experiments pythontb pythonptprofiler pythondataviewer javagetstartedt
1
66,205
20,049,679,943
IssuesEvent
2022-02-03 03:49:44
vector-im/element-web
https://api.github.com/repos/vector-im/element-web
closed
Element Web shows blank message for polls if Labs feature disabled (or if on Stable, where it doesn't exist)
T-Defect X-Regression S-Minor A-Timeline O-Uncommon A-Polls
### Steps to reproduce 1. Where are you starting? What can you see? On Element/Web stable (or on Develop with the polls feature disabled) 2. What do you click? A room a poll has been started in ### Outcome #### What did you expect? Either not seeing the poll start message at all, or some kind of UI indicating an unrecognized message type. #### What happened instead? A message that looks like a perfectly ordinary textual message, but with no text at all. In Develop, with the feature disabled: ![image](https://user-images.githubusercontent.com/99404/144201541-97484733-6158-4785-9a61-36509b1f9ed7.png) (the stable case was reported to me by a friend, at which point I repro'd in Develop) ### Operating system Exherbo Linux ### Browser information Firefox 94.0.2 ### URL for webapp develop.element.io ### Application version Element version: 090fc808bbc9-react-5c895bf3f68d-js-db9936e07c39 Olm version: 3.2.3 ### Homeserver matrix.org ### Will you send logs? No
1.0
Element Web shows blank message for polls if Labs feature disabled (or if on Stable, where it doesn't exist) - ### Steps to reproduce 1. Where are you starting? What can you see? On Element/Web stable (or on Develop with the polls feature disabled) 2. What do you click? A room a poll has been started in ### Outcome #### What did you expect? Either not seeing the poll start message at all, or some kind of UI indicating an unrecognized message type. #### What happened instead? A message that looks like a perfectly ordinary textual message, but with no text at all. In Develop, with the feature disabled: ![image](https://user-images.githubusercontent.com/99404/144201541-97484733-6158-4785-9a61-36509b1f9ed7.png) (the stable case was reported to me by a friend, at which point I repro'd in Develop) ### Operating system Exherbo Linux ### Browser information Firefox 94.0.2 ### URL for webapp develop.element.io ### Application version Element version: 090fc808bbc9-react-5c895bf3f68d-js-db9936e07c39 Olm version: 3.2.3 ### Homeserver matrix.org ### Will you send logs? No
non_process
element web shows blank message for polls if labs feature disabled or if on stable where it doesn t exist steps to reproduce where are you starting what can you see on element web stable or on develop with the polls feature disabled what do you click a room a poll has been started in outcome what did you expect either not seeing the poll start message at all or some kind of ui indicating an unrecognized message type what happened instead a message that looks like a perfectly ordinary textual message but with no text at all in develop with the feature disabled the stable case was reported to me by a friend at which point i repro d in develop operating system exherbo linux browser information firefox url for webapp develop element io application version element version react js olm version homeserver matrix org will you send logs no
0
190,838
15,256,968,985
IssuesEvent
2021-02-20 22:41:12
100Automations/Website
https://api.github.com/repos/100Automations/Website
opened
Update new automation image size
documentation role: product
### Overview We need to update the new automation image size (744 by 300) across all documentation that mentions it ### Action Items - [ ] guide for image creation - [ ] automation template ### Resources/Instructions
1.0
Update new automation image size - ### Overview We need to update the new automation image size (744 by 300) across all documentation that mentions it ### Action Items - [ ] guide for image creation - [ ] automation template ### Resources/Instructions
non_process
update new automation image size overview we need to update the new automation image size by across all documentation that mentions it action items guide for image creation automation template resources instructions
0
11,101
13,941,603,173
IssuesEvent
2020-10-22 19:40:14
googleapis/google-resumable-media-python
https://api.github.com/repos/googleapis/google-resumable-media-python
closed
Tests broken on master
testing type: process
```python _____________ ERROR collecting tests_async/unit/test__download.py ______________ ImportError while importing test module '/home/tseaver/projects/agendaless/Google/src/grmp/tests_async/unit/test__download.py'. Hint: make sure your test modules/packages have valid Python names. Traceback: /opt/Python-3.8.1/lib/python3.8/importlib/__init__.py:127: in import_module return _bootstrap._gcd_import(name[level:], package, level) tests_async/unit/test__download.py:25: in <module> import google.auth.transport._aiohttp_requests as aiohttp_requests .nox/unit-3-8/lib/python3.8/site-packages/google/auth/transport/_aiohttp_requests.py:26: in <module> import aiohttp E ModuleNotFoundError: No module named 'aiohttp' _________ ERROR collecting tests_async/unit/requests/test__helpers.py __________ ImportError while importing test module '/home/tseaver/projects/agendaless/Google/src/grmp/tests_async/unit/requests/test__helpers.py'. Hint: make sure your test modules/packages have valid Python names. Traceback: /opt/Python-3.8.1/lib/python3.8/importlib/__init__.py:127: in import_module return _bootstrap._gcd_import(name[level:], package, level) tests_async/unit/requests/test__helpers.py:15: in <module> import aiohttp E ModuleNotFoundError: No module named 'aiohttp' _________ ERROR collecting tests_async/unit/requests/test_download.py __________ ImportError while importing test module '/home/tseaver/projects/agendaless/Google/src/grmp/tests_async/unit/requests/test_download.py'. Hint: make sure your test modules/packages have valid Python names. 
Traceback: /opt/Python-3.8.1/lib/python3.8/importlib/__init__.py:127: in import_module return _bootstrap._gcd_import(name[level:], package, level) tests_async/unit/requests/test_download.py:17: in <module> import aiohttp E ModuleNotFoundError: No module named 'aiohttp' __________ ERROR collecting tests_async/unit/requests/test_upload.py ___________ ImportError while importing test module '/home/tseaver/projects/agendaless/Google/src/grmp/tests_async/unit/requests/test_upload.py'. Hint: make sure your test modules/packages have valid Python names. Traceback: /opt/Python-3.8.1/lib/python3.8/importlib/__init__.py:127: in import_module return _bootstrap._gcd_import(name[level:], package, level) tests_async/unit/requests/test_upload.py:15: in <module> import aiohttp E ModuleNotFoundError: No module named 'aiohttp' ```
1.0
Tests broken on master - ```python _____________ ERROR collecting tests_async/unit/test__download.py ______________ ImportError while importing test module '/home/tseaver/projects/agendaless/Google/src/grmp/tests_async/unit/test__download.py'. Hint: make sure your test modules/packages have valid Python names. Traceback: /opt/Python-3.8.1/lib/python3.8/importlib/__init__.py:127: in import_module return _bootstrap._gcd_import(name[level:], package, level) tests_async/unit/test__download.py:25: in <module> import google.auth.transport._aiohttp_requests as aiohttp_requests .nox/unit-3-8/lib/python3.8/site-packages/google/auth/transport/_aiohttp_requests.py:26: in <module> import aiohttp E ModuleNotFoundError: No module named 'aiohttp' _________ ERROR collecting tests_async/unit/requests/test__helpers.py __________ ImportError while importing test module '/home/tseaver/projects/agendaless/Google/src/grmp/tests_async/unit/requests/test__helpers.py'. Hint: make sure your test modules/packages have valid Python names. Traceback: /opt/Python-3.8.1/lib/python3.8/importlib/__init__.py:127: in import_module return _bootstrap._gcd_import(name[level:], package, level) tests_async/unit/requests/test__helpers.py:15: in <module> import aiohttp E ModuleNotFoundError: No module named 'aiohttp' _________ ERROR collecting tests_async/unit/requests/test_download.py __________ ImportError while importing test module '/home/tseaver/projects/agendaless/Google/src/grmp/tests_async/unit/requests/test_download.py'. Hint: make sure your test modules/packages have valid Python names. Traceback: /opt/Python-3.8.1/lib/python3.8/importlib/__init__.py:127: in import_module return _bootstrap._gcd_import(name[level:], package, level) tests_async/unit/requests/test_download.py:17: in <module> import aiohttp E ModuleNotFoundError: No module named 'aiohttp' __________ ERROR collecting tests_async/unit/requests/test_upload.py ___________ ImportError while importing test module '/home/tseaver/projects/agendaless/Google/src/grmp/tests_async/unit/requests/test_upload.py'. Hint: make sure your test modules/packages have valid Python names. Traceback: /opt/Python-3.8.1/lib/python3.8/importlib/__init__.py:127: in import_module return _bootstrap._gcd_import(name[level:], package, level) tests_async/unit/requests/test_upload.py:15: in <module> import aiohttp E ModuleNotFoundError: No module named 'aiohttp' ```
process
tests broken on master python error collecting tests async unit test download py importerror while importing test module home tseaver projects agendaless google src grmp tests async unit test download py hint make sure your test modules packages have valid python names traceback opt python lib importlib init py in import module return bootstrap gcd import name package level tests async unit test download py in import google auth transport aiohttp requests as aiohttp requests nox unit lib site packages google auth transport aiohttp requests py in import aiohttp e modulenotfounderror no module named aiohttp error collecting tests async unit requests test helpers py importerror while importing test module home tseaver projects agendaless google src grmp tests async unit requests test helpers py hint make sure your test modules packages have valid python names traceback opt python lib importlib init py in import module return bootstrap gcd import name package level tests async unit requests test helpers py in import aiohttp e modulenotfounderror no module named aiohttp error collecting tests async unit requests test download py importerror while importing test module home tseaver projects agendaless google src grmp tests async unit requests test download py hint make sure your test modules packages have valid python names traceback opt python lib importlib init py in import module return bootstrap gcd import name package level tests async unit requests test download py in import aiohttp e modulenotfounderror no module named aiohttp error collecting tests async unit requests test upload py importerror while importing test module home tseaver projects agendaless google src grmp tests async unit requests test upload py hint make sure your test modules packages have valid python names traceback opt python lib importlib init py in import module return bootstrap gcd import name package level tests async unit requests test upload py in import aiohttp e modulenotfounderror no module named aiohttp
1
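The aiohttp record above shows test modules failing at collection time because they import an optional dependency unconditionally. A minimal sketch of guarding such an import so the suite skips instead of erroring — the `TestAiohttpDependent` class and its test name are hypothetical, not part of the original repository; only `aiohttp` comes from the traceback:

```python
import importlib.util
import unittest

def module_available(name: str) -> bool:
    # find_spec probes the import machinery without executing the module,
    # so it cannot break test collection the way a bare `import` can.
    return importlib.util.find_spec(name) is not None

# In a pytest suite the idiomatic guard would be a module-level
#     aiohttp = pytest.importorskip("aiohttp")
# which turns the ModuleNotFoundError above into a skip instead of an error.

@unittest.skipUnless(module_available("aiohttp"), "aiohttp not installed")
class TestAiohttpDependent(unittest.TestCase):
    def test_client_session_exists(self):
        import aiohttp  # only runs when the guard passed
        self.assertTrue(hasattr(aiohttp, "ClientSession"))
```

The alternative fix, of course, is simply installing the dependency into the test session (here, adding `aiohttp` to the nox/tox deps), which is what the report implies is missing.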
157,036
12,343,678,301
IssuesEvent
2020-05-15 04:51:43
gluster/glusterfs
https://api.github.com/repos/gluster/glusterfs
closed
./tests/features/ssl-ciphers.t test case failing on opensuse:15
FA: Testing Improvements Type:Bug wontfix
Problem Description: When running the test case using the command `./run-tests.sh prove -vf ./tests/features/ssl-ciphers.t`, 2 subtests fail as follows:- ``` ========================= TEST 23 (line 107): N openssl_connect -tls1 -connect $H0:$BRICK_PORT not ok 23 Got "Y" instead of "N", LINENUM:107 RESULT 23: 1 ========================= ========================= TEST 57 (line 168): N openssl_connect -cipher EECDH -connect $H0:$BRICK_PORT not ok 57 Got "Y" instead of "N", LINENUM:168 RESULT 57: 1 ========================= Test Summary Report ------------------- ./tests/features/ssl-ciphers.t (Wstat: 0 Tests: 78 Failed: 2) Failed tests: 23, 57 Files=1, Tests=78, 43 wallclock secs ( 0.11 usr 0.01 sys + 13.21 cusr 2.32 csys = 15.65 CPU) Result: FAIL ``` ``` openssl version OpenSSL 1.1.0i-fips 14 Aug 2018 ``` Environment: OS (e.g. from /etc/os-release): ``` cat /etc/os-release NAME="openSUSE Leap" VERSION="15.0" ID="opensuse-leap" ID_LIKE="suse opensuse" VERSION_ID="15.0" PRETTY_NAME="openSUSE Leap 15.0" ANSI_COLOR="0;32" CPE_NAME="cpe:/o:opensuse:leap:15.0" BUG_REPORT_URL="https://bugs.opensuse.org" HOME_URL="https://www.opensuse.org/" ``` Kernel (e.g. uname -a): **x86_64 GNU/Linux** Glusterfs version: v4.1.5 PFA the logs for glusterd. [glusterd.log](https://github.com/gluster/glusterfs/files/2537421/glusterd.log)
1.0
./tests/features/ssl-ciphers.t test case failing on opensuse:15 - Problem Description: When running the test case using the command `./run-tests.sh prove -vf ./tests/features/ssl-ciphers.t`, 2 subtests fail as follows:- ``` ========================= TEST 23 (line 107): N openssl_connect -tls1 -connect $H0:$BRICK_PORT not ok 23 Got "Y" instead of "N", LINENUM:107 RESULT 23: 1 ========================= ========================= TEST 57 (line 168): N openssl_connect -cipher EECDH -connect $H0:$BRICK_PORT not ok 57 Got "Y" instead of "N", LINENUM:168 RESULT 57: 1 ========================= Test Summary Report ------------------- ./tests/features/ssl-ciphers.t (Wstat: 0 Tests: 78 Failed: 2) Failed tests: 23, 57 Files=1, Tests=78, 43 wallclock secs ( 0.11 usr 0.01 sys + 13.21 cusr 2.32 csys = 15.65 CPU) Result: FAIL ``` ``` openssl version OpenSSL 1.1.0i-fips 14 Aug 2018 ``` Environment: OS (e.g. from /etc/os-release): ``` cat /etc/os-release NAME="openSUSE Leap" VERSION="15.0" ID="opensuse-leap" ID_LIKE="suse opensuse" VERSION_ID="15.0" PRETTY_NAME="openSUSE Leap 15.0" ANSI_COLOR="0;32" CPE_NAME="cpe:/o:opensuse:leap:15.0" BUG_REPORT_URL="https://bugs.opensuse.org" HOME_URL="https://www.opensuse.org/" ``` Kernel (e.g. uname -a): **x86_64 G NU/Linux** Glusterfs version: v4.1.5 PFA the logs for glusterd. [glusterd.log](https://github.com/gluster/glusterfs/files/2537421/glusterd.log)
non_process
tests features ssl ciphers t test case failing on opensuse problem description when running the test case using the command run tests sh prove vf tests features ssl ciphers t subtests fail as follows test line n openssl connect connect brick port not ok got y instead of n linenum result test line n openssl connect cipher eecdh connect brick port not ok got y instead of n linenum result test summary report tests features ssl ciphers t wstat tests failed failed tests files tests wallclock secs usr sys cusr csys cpu result fail openssl version openssl fips aug environment os e g from etc os release cat etc os release name opensuse leap version id opensuse leap id like suse opensuse version id pretty name opensuse leap ansi color cpe name cpe o opensuse leap bug report url home url kernel e g uname a gnu linux glusterfs version pfa the logs for glusterd
0
15,606
19,728,395,208
IssuesEvent
2022-01-13 22:34:02
googleapis/python-automl
https://api.github.com/repos/googleapis/python-automl
reopened
tests.system.gapic.v1beta1.test_system_tables_client_v1.TestSystemTablesClient: test_import_data failed
type: process priority: p1 api: automl flakybot: issue flakybot: flaky
Note: #183 was also for this test, but it was closed more than 10 days ago. So, I didn't mark it flaky. ---- commit: a0f05b163013d4c4c8b2860882b16cf815c32188 buildURL: [Build Status](https://source.cloud.google.com/results/invocations/3f928980-15be-4c11-9df8-eb839c280094), [Sponge](http://sponge2/3f928980-15be-4c11-9df8-eb839c280094) status: failed <details><summary>Test output</summary><br><pre>self = <test_system_tables_client_v1.TestSystemTablesClient object at 0x7f8bd5e0e970> @vpcsc_config.skip_if_inside_vpcsc def test_import_data(self): client = automl_v1beta1.TablesClient(project=PROJECT, region=REGION) display_name = _id("t_import") dataset = client.create_dataset(display_name) op = client.import_data( dataset=dataset, gcs_input_uris="gs://cloud-ml-tables-data/bank-marketing.csv", ) > self.cancel_and_wait(op) tests/system/gapic/v1beta1/test_system_tables_client_v1.py:98: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = <test_system_tables_client_v1.TestSystemTablesClient object at 0x7f8bd5e0e970> op = <google.api_core.operation.Operation object at 0x7f8bd5e47a00> def cancel_and_wait(self, op): op.cancel() start = time.time() sleep_time = 1 while time.time() - start < MAX_WAIT_TIME_SECONDS: if op.cancelled(): return time.sleep(sleep_time) sleep_time = min(sleep_time * 2, MAX_SLEEP_TIME_SECONDS) > assert op.cancelled() E assert False E + where False = <bound method Operation.cancelled of <google.api_core.operation.Operation object at 0x7f8bd5e47a00>>() E + where <bound method Operation.cancelled of <google.api_core.operation.Operation object at 0x7f8bd5e47a00>> = <google.api_core.operation.Operation object at 0x7f8bd5e47a00>.cancelled tests/system/gapic/v1beta1/test_system_tables_client_v1.py:59: AssertionError</pre></details>
1.0
tests.system.gapic.v1beta1.test_system_tables_client_v1.TestSystemTablesClient: test_import_data failed - Note: #183 was also for this test, but it was closed more than 10 days ago. So, I didn't mark it flaky. ---- commit: a0f05b163013d4c4c8b2860882b16cf815c32188 buildURL: [Build Status](https://source.cloud.google.com/results/invocations/3f928980-15be-4c11-9df8-eb839c280094), [Sponge](http://sponge2/3f928980-15be-4c11-9df8-eb839c280094) status: failed <details><summary>Test output</summary><br><pre>self = <test_system_tables_client_v1.TestSystemTablesClient object at 0x7f8bd5e0e970> @vpcsc_config.skip_if_inside_vpcsc def test_import_data(self): client = automl_v1beta1.TablesClient(project=PROJECT, region=REGION) display_name = _id("t_import") dataset = client.create_dataset(display_name) op = client.import_data( dataset=dataset, gcs_input_uris="gs://cloud-ml-tables-data/bank-marketing.csv", ) > self.cancel_and_wait(op) tests/system/gapic/v1beta1/test_system_tables_client_v1.py:98: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = <test_system_tables_client_v1.TestSystemTablesClient object at 0x7f8bd5e0e970> op = <google.api_core.operation.Operation object at 0x7f8bd5e47a00> def cancel_and_wait(self, op): op.cancel() start = time.time() sleep_time = 1 while time.time() - start < MAX_WAIT_TIME_SECONDS: if op.cancelled(): return time.sleep(sleep_time) sleep_time = min(sleep_time * 2, MAX_SLEEP_TIME_SECONDS) > assert op.cancelled() E assert False E + where False = <bound method Operation.cancelled of <google.api_core.operation.Operation object at 0x7f8bd5e47a00>>() E + where <bound method Operation.cancelled of <google.api_core.operation.Operation object at 0x7f8bd5e47a00>> = <google.api_core.operation.Operation object at 0x7f8bd5e47a00>.cancelled tests/system/gapic/v1beta1/test_system_tables_client_v1.py:59: AssertionError</pre></details>
process
tests system gapic test system tables client testsystemtablesclient test import data failed note was also for this test but it was closed more than days ago so i didn t mark it flaky commit buildurl status failed test output self vpcsc config skip if inside vpcsc def test import data self client automl tablesclient project project region region display name id t import dataset client create dataset display name op client import data dataset dataset gcs input uris gs cloud ml tables data bank marketing csv self cancel and wait op tests system gapic test system tables client py self op def cancel and wait self op op cancel start time time sleep time while time time start max wait time seconds if op cancelled return time sleep sleep time sleep time min sleep time max sleep time seconds assert op cancelled e assert false e where false e where cancelled tests system gapic test system tables client py assertionerror
1
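The `cancel_and_wait` helper in the flaky-test record polls `op.cancelled()` with a doubling sleep capped at `MAX_SLEEP_TIME_SECONDS`, and the assertion fires when the operation never reaches the cancelled state before `MAX_WAIT_TIME_SECONDS` elapses. A generic sketch of that wait-with-capped-backoff pattern, written with an injectable clock and sleep so it can be exercised without real delays — the function name and signature are illustrative, not part of the google-api-core API:

```python
import time

def wait_until(predicate, timeout, max_sleep,
               clock=time.monotonic, sleep=time.sleep):
    """Poll `predicate` with doubling back-off capped at `max_sleep`.

    Returns True as soon as the predicate holds, False once `timeout`
    seconds have elapsed -- the same shape as the test's cancel_and_wait,
    but reporting a status instead of asserting.
    """
    start = clock()
    delay = 1
    while clock() - start < timeout:
        if predicate():
            return True
        sleep(delay)
        # Exponential back-off: 1, 2, 4, ... seconds, capped at max_sleep.
        delay = min(delay * 2, max_sleep)
    return False
```

With a fake clock that advances only inside `sleep`, the back-off schedule is deterministic, which is how the test below verifies the 1, 2, 4 progression and the timeout path.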
15,500
19,703,262,995
IssuesEvent
2022-01-12 18:52:04
googleapis/google-cloud-ruby
https://api.github.com/repos/googleapis/google-cloud-ruby
opened
Your .repo-metadata.json files have a problem 🤒
type: process repo-metadata: lint
You have a problem with your .repo-metadata.json files: Result of scan 📈: * must have required property 'library_type' in gcloud/.repo-metadata.json * must have required property 'release_level' in gcloud/.repo-metadata.json * must have required property 'release_level' in google-analytics-admin-v1alpha/.repo-metadata.json * api_shortname field missing from google-analytics-admin-v1alpha/.repo-metadata.json * must have required property 'release_level' in google-analytics-admin/.repo-metadata.json * api_shortname field missing from google-analytics-admin/.repo-metadata.json * must have required property 'release_level' in google-analytics-data-v1alpha/.repo-metadata.json * api_shortname field missing from google-analytics-data-v1alpha/.repo-metadata.json * must have required property 'release_level' in google-analytics-data-v1beta/.repo-metadata.json * api_shortname field missing from google-analytics-data-v1beta/.repo-metadata.json * must have required property 'release_level' in google-analytics-data/.repo-metadata.json * api_shortname field missing from google-analytics-data/.repo-metadata.json * must have required property 'release_level' in google-area120-tables-v1alpha1/.repo-metadata.json * api_shortname field missing from google-area120-tables-v1alpha1/.repo-metadata.json * must have required property 'release_level' in google-area120-tables/.repo-metadata.json * api_shortname field missing from google-area120-tables/.repo-metadata.json * must have required property 'release_level' in google-cloud-access_approval-v1/.repo-metadata.json * api_shortname field missing from google-cloud-access_approval-v1/.repo-metadata.json * must have required property 'release_level' in google-cloud-access_approval/.repo-metadata.json * api_shortname field missing from google-cloud-access_approval/.repo-metadata.json * must have required property 'release_level' in google-cloud-api_gateway-v1/.repo-metadata.json * api_shortname field missing from 
google-cloud-api_gateway-v1/.repo-metadata.json * must have required property 'release_level' in google-cloud-api_gateway/.repo-metadata.json * api_shortname field missing from google-cloud-api_gateway/.repo-metadata.json * must have required property 'release_level' in google-cloud-apigee_connect-v1/.repo-metadata.json * api_shortname field missing from google-cloud-apigee_connect-v1/.repo-metadata.json * must have required property 'release_level' in google-cloud-apigee_connect/.repo-metadata.json * api_shortname field missing from google-cloud-apigee_connect/.repo-metadata.json * must have required property 'release_level' in google-cloud-app_engine-v1/.repo-metadata.json * api_shortname field missing from google-cloud-app_engine-v1/.repo-metadata.json * must have required property 'release_level' in google-cloud-app_engine/.repo-metadata.json * api_shortname field missing from google-cloud-app_engine/.repo-metadata.json * must have required property 'release_level' in google-cloud-artifact_registry-v1/.repo-metadata.json * api_shortname field missing from google-cloud-artifact_registry-v1/.repo-metadata.json * must have required property 'release_level' in google-cloud-artifact_registry-v1beta2/.repo-metadata.json * api_shortname field missing from google-cloud-artifact_registry-v1beta2/.repo-metadata.json * must have required property 'release_level' in google-cloud-artifact_registry/.repo-metadata.json * api_shortname field missing from google-cloud-artifact_registry/.repo-metadata.json * must have required property 'release_level' in google-cloud-asset-v1/.repo-metadata.json * api_shortname field missing from google-cloud-asset-v1/.repo-metadata.json * must have required property 'library_type' in google-cloud-asset-v1beta1/.repo-metadata.json * must have required property 'release_level' in google-cloud-asset-v1beta1/.repo-metadata.json * must have required property 'release_level' in google-cloud-asset/.repo-metadata.json * api_shortname field missing from 
google-cloud-asset/.repo-metadata.json * must have required property 'release_level' in google-cloud-assured_workloads-v1/.repo-metadata.json * api_shortname field missing from google-cloud-assured_workloads-v1/.repo-metadata.json * must have required property 'release_level' in google-cloud-assured_workloads-v1beta1/.repo-metadata.json * api_shortname field missing from google-cloud-assured_workloads-v1beta1/.repo-metadata.json * must have required property 'release_level' in google-cloud-assured_workloads/.repo-metadata.json * api_shortname field missing from google-cloud-assured_workloads/.repo-metadata.json * must have required property 'release_level' in google-cloud-automl-v1/.repo-metadata.json * api_shortname field missing from google-cloud-automl-v1/.repo-metadata.json * must have required property 'release_level' in google-cloud-automl-v1beta1/.repo-metadata.json * api_shortname field missing from google-cloud-automl-v1beta1/.repo-metadata.json * must have required property 'release_level' in google-cloud-automl/.repo-metadata.json * api_shortname field missing from google-cloud-automl/.repo-metadata.json * must have required property 'release_level' in google-cloud-bigquery-connection-v1/.repo-metadata.json * api_shortname field missing from google-cloud-bigquery-connection-v1/.repo-metadata.json * must have required property 'release_level' in google-cloud-bigquery-connection/.repo-metadata.json * api_shortname field missing from google-cloud-bigquery-connection/.repo-metadata.json * must have required property 'release_level' in google-cloud-bigquery-data_transfer-v1/.repo-metadata.json * api_shortname field missing from google-cloud-bigquery-data_transfer-v1/.repo-metadata.json * must have required property 'release_level' in google-cloud-bigquery-data_transfer/.repo-metadata.json * api_shortname field missing from google-cloud-bigquery-data_transfer/.repo-metadata.json * must have required property 'release_level' in 
google-cloud-bigquery-reservation-v1/.repo-metadata.json * api_shortname field missing from google-cloud-bigquery-reservation-v1/.repo-metadata.json * must have required property 'release_level' in google-cloud-bigquery-reservation/.repo-metadata.json * api_shortname field missing from google-cloud-bigquery-reservation/.repo-metadata.json * must have required property 'release_level' in google-cloud-bigquery-storage-v1/.repo-metadata.json * api_shortname field missing from google-cloud-bigquery-storage-v1/.repo-metadata.json * must have required property 'release_level' in google-cloud-bigquery-storage/.repo-metadata.json * api_shortname field missing from google-cloud-bigquery-storage/.repo-metadata.json * must have required property 'release_level' in google-cloud-bigquery/.repo-metadata.json * api_shortname field missing from google-cloud-bigquery/.repo-metadata.json * must have required property 'release_level' in google-cloud-bigtable-admin-v2/.repo-metadata.json * api_shortname field missing from google-cloud-bigtable-admin-v2/.repo-metadata.json * must have required property 'release_level' in google-cloud-bigtable-v2/.repo-metadata.json * api_shortname field missing from google-cloud-bigtable-v2/.repo-metadata.json * must have required property 'release_level' in google-cloud-bigtable/.repo-metadata.json * api_shortname field missing from google-cloud-bigtable/.repo-metadata.json * must have required property 'release_level' in google-cloud-billing-budgets-v1/.repo-metadata.json * api_shortname field missing from google-cloud-billing-budgets-v1/.repo-metadata.json * must have required property 'release_level' in google-cloud-billing-budgets-v1beta1/.repo-metadata.json * api_shortname field missing from google-cloud-billing-budgets-v1beta1/.repo-metadata.json * must have required property 'release_level' in google-cloud-billing-budgets/.repo-metadata.json * api_shortname field missing from google-cloud-billing-budgets/.repo-metadata.json * must have required 
property 'release_level' in google-cloud-billing-v1/.repo-metadata.json * api_shortname field missing from google-cloud-billing-v1/.repo-metadata.json * must have required property 'release_level' in google-cloud-billing/.repo-metadata.json * api_shortname field missing from google-cloud-billing/.repo-metadata.json * must have required property 'release_level' in google-cloud-binary_authorization-v1/.repo-metadata.json * api_shortname field missing from google-cloud-binary_authorization-v1/.repo-metadata.json * must have required property 'release_level' in google-cloud-binary_authorization-v1beta1/.repo-metadata.json * api_shortname field missing from google-cloud-binary_authorization-v1beta1/.repo-metadata.json * must have required property 'release_level' in google-cloud-binary_authorization/.repo-metadata.json * api_shortname field missing from google-cloud-binary_authorization/.repo-metadata.json * must have required property 'release_level' in google-cloud-build-v1/.repo-metadata.json * api_shortname field missing from google-cloud-build-v1/.repo-metadata.json * must have required property 'release_level' in google-cloud-build/.repo-metadata.json * api_shortname field missing from google-cloud-build/.repo-metadata.json * must have required property 'release_level' in google-cloud-channel-v1/.repo-metadata.json * api_shortname field missing from google-cloud-channel-v1/.repo-metadata.json * must have required property 'release_level' in google-cloud-channel/.repo-metadata.json * api_shortname field missing from google-cloud-channel/.repo-metadata.json * must have required property 'release_level' in google-cloud-cloud_dms-v1/.repo-metadata.json * api_shortname field missing from google-cloud-cloud_dms-v1/.repo-metadata.json * must have required property 'release_level' in google-cloud-cloud_dms/.repo-metadata.json * api_shortname field missing from google-cloud-cloud_dms/.repo-metadata.json * must have required property 'release_level' in 
google-cloud-compute-v1/.repo-metadata.json * api_shortname field missing from google-cloud-compute-v1/.repo-metadata.json * must have required property 'release_level' in google-cloud-contact_center_insights-v1/.repo-metadata.json * api_shortname field missing from google-cloud-contact_center_insights-v1/.repo-metadata.json * must have required property 'release_level' in google-cloud-contact_center_insights/.repo-metadata.json * api_shortname field missing from google-cloud-contact_center_insights/.repo-metadata.json * must have required property 'release_level' in google-cloud-container-v1/.repo-metadata.json * api_shortname field missing from google-cloud-container-v1/.repo-metadata.json * must have required property 'release_level' in google-cloud-container-v1beta1/.repo-metadata.json * api_shortname field missing from google-cloud-container-v1beta1/.repo-metadata.json * must have required property 'release_level' in google-cloud-container/.repo-metadata.json * api_shortname field missing from google-cloud-container/.repo-metadata.json * must have required property 'release_level' in google-cloud-container_analysis-v1/.repo-metadata.json * api_shortname field missing from google-cloud-container_analysis-v1/.repo-metadata.json * must have required property 'release_level' in google-cloud-container_analysis/.repo-metadata.json * api_shortname field missing from google-cloud-container_analysis/.repo-metadata.json * must have required property 'release_level' in google-cloud-core/.repo-metadata.json * must have required property 'release_level' in google-cloud-data_catalog-v1/.repo-metadata.json * api_shortname field missing from google-cloud-data_catalog-v1/.repo-metadata.json * must have required property 'release_level' in google-cloud-data_catalog/.repo-metadata.json * api_shortname field missing from google-cloud-data_catalog/.repo-metadata.json * must have required property 'release_level' in google-cloud-data_fusion-v1/.repo-metadata.json * api_shortname 
field missing from google-cloud-data_fusion-v1/.repo-metadata.json * must have required property 'release_level' in google-cloud-data_fusion/.repo-metadata.json * api_shortname field missing from google-cloud-data_fusion/.repo-metadata.json * must have required property 'release_level' in google-cloud-data_labeling-v1beta1/.repo-metadata.json * api_shortname field missing from google-cloud-data_labeling-v1beta1/.repo-metadata.json * must have required property 'release_level' in google-cloud-data_labeling/.repo-metadata.json * api_shortname field missing from google-cloud-data_labeling/.repo-metadata.json * must have required property 'release_level' in google-cloud-dataflow-v1beta3/.repo-metadata.json * api_shortname field missing from google-cloud-dataflow-v1beta3/.repo-metadata.json * must have required property 'release_level' in google-cloud-dataflow/.repo-metadata.json * api_shortname field missing from google-cloud-dataflow/.repo-metadata.json * must have required property 'release_level' in google-cloud-dataproc-v1/.repo-metadata.json * api_shortname field missing from google-cloud-dataproc-v1/.repo-metadata.json * must have required property 'release_level' in google-cloud-dataproc-v1beta2/.repo-metadata.json * api_shortname field missing from google-cloud-dataproc-v1beta2/.repo-metadata.json * must have required property 'release_level' in google-cloud-dataproc/.repo-metadata.json * api_shortname field missing from google-cloud-dataproc/.repo-metadata.json * must have required property 'release_level' in google-cloud-dataqna-v1alpha/.repo-metadata.json * api_shortname field missing from google-cloud-dataqna-v1alpha/.repo-metadata.json * must have required property 'release_level' in google-cloud-dataqna/.repo-metadata.json * api_shortname field missing from google-cloud-dataqna/.repo-metadata.json * must have required property 'release_level' in google-cloud-datastore-admin-v1/.repo-metadata.json * api_shortname field missing from 
google-cloud-datastore-admin-v1/.repo-metadata.json * must have required property 'release_level' in google-cloud-datastore-v1/.repo-metadata.json * api_shortname field missing from google-cloud-datastore-v1/.repo-metadata.json * must have required property 'release_level' in google-cloud-datastore/.repo-metadata.json * api_shortname field missing from google-cloud-datastore/.repo-metadata.json * must have required property 'release_level' in google-cloud-datastream-v1alpha1/.repo-metadata.json * api_shortname field missing from google-cloud-datastream-v1alpha1/.repo-metadata.json * must have required property 'release_level' in google-cloud-datastream/.repo-metadata.json * api_shortname field missing from google-cloud-datastream/.repo-metadata.json * must have required property 'release_level' in google-cloud-debugger-v2/.repo-metadata.json * api_shortname field missing from google-cloud-debugger-v2/.repo-metadata.json * must have required property 'release_level' in google-cloud-debugger/.repo-metadata.json * must have required property 'release_level' in google-cloud-deploy-v1/.repo-metadata.json * api_shortname field missing from google-cloud-deploy-v1/.repo-metadata.json * must have required property 'release_level' in google-cloud-deploy/.repo-metadata.json * api_shortname field missing from google-cloud-deploy/.repo-metadata.json * must have required property 'release_level' in google-cloud-dialogflow-cx-v3/.repo-metadata.json * api_shortname field missing from google-cloud-dialogflow-cx-v3/.repo-metadata.json * must have required property 'release_level' in google-cloud-dialogflow-cx/.repo-metadata.json * api_shortname field missing from google-cloud-dialogflow-cx/.repo-metadata.json * must have required property 'release_level' in google-cloud-dialogflow-v2/.repo-metadata.json * api_shortname field missing from google-cloud-dialogflow-v2/.repo-metadata.json * must have required property 'release_level' in google-cloud-dialogflow/.repo-metadata.json * 
api_shortname field missing from google-cloud-dialogflow/.repo-metadata.json * must have required property 'release_level' in google-cloud-dlp-v2/.repo-metadata.json * api_shortname field missing from google-cloud-dlp-v2/.repo-metadata.json * must have required property 'release_level' in google-cloud-dlp/.repo-metadata.json * api_shortname field missing from google-cloud-dlp/.repo-metadata.json * must have required property 'release_level' in google-cloud-dns/.repo-metadata.json * api_shortname field missing from google-cloud-dns/.repo-metadata.json * must have required property 'release_level' in google-cloud-document_ai-v1/.repo-metadata.json * api_shortname field missing from google-cloud-document_ai-v1/.repo-metadata.json * must have required property 'release_level' in google-cloud-document_ai-v1beta3/.repo-metadata.json * api_shortname field missing from google-cloud-document_ai-v1beta3/.repo-metadata.json * must have required property 'release_level' in google-cloud-document_ai/.repo-metadata.json * api_shortname field missing from google-cloud-document_ai/.repo-metadata.json * must have required property 'release_level' in google-cloud-domains-v1beta1/.repo-metadata.json * api_shortname field missing from google-cloud-domains-v1beta1/.repo-metadata.json * must have required property 'release_level' in google-cloud-domains/.repo-metadata.json * api_shortname field missing from google-cloud-domains/.repo-metadata.json * must have required property 'release_level' in google-cloud-error_reporting-v1beta1/.repo-metadata.json * api_shortname field missing from google-cloud-error_reporting-v1beta1/.repo-metadata.json * must have required property 'release_level' in google-cloud-error_reporting/.repo-metadata.json * must have required property 'release_level' in google-cloud-errors/.repo-metadata.json * must have required property 'release_level' in google-cloud-essential_contacts-v1/.repo-metadata.json * api_shortname field missing from 
google-cloud-essential_contacts-v1/.repo-metadata.json * must have required property 'release_level' in google-cloud-essential_contacts/.repo-metadata.json * api_shortname field missing from google-cloud-essential_contacts/.repo-metadata.json * must have required property 'release_level' in google-cloud-eventarc-v1/.repo-metadata.json * api_shortname field missing from google-cloud-eventarc-v1/.repo-metadata.json * must have required property 'release_level' in google-cloud-eventarc/.repo-metadata.json * api_shortname field missing from google-cloud-eventarc/.repo-metadata.json * must have required property 'release_level' in google-cloud-filestore-v1/.repo-metadata.json * api_shortname field missing from google-cloud-filestore-v1/.repo-metadata.json * must have required property 'release_level' in google-cloud-filestore/.repo-metadata.json * api_shortname field missing from google-cloud-filestore/.repo-metadata.json * must have required property 'release_level' in google-cloud-firestore-admin-v1/.repo-metadata.json * api_shortname field missing from google-cloud-firestore-admin-v1/.repo-metadata.json * must have required property 'release_level' in google-cloud-firestore-v1/.repo-metadata.json * api_shortname field missing from google-cloud-firestore-v1/.repo-metadata.json * must have required property 'release_level' in google-cloud-firestore/.repo-metadata.json * api_shortname field missing from google-cloud-firestore/.repo-metadata.json * must have required property 'release_level' in google-cloud-functions-v1/.repo-metadata.json * api_shortname field missing from google-cloud-functions-v1/.repo-metadata.json * must have required property 'release_level' in google-cloud-functions/.repo-metadata.json * api_shortname field missing from google-cloud-functions/.repo-metadata.json * must have required property 'release_level' in google-cloud-gaming-v1/.repo-metadata.json * api_shortname field missing from google-cloud-gaming-v1/.repo-metadata.json * must have 
required property 'release_level' in google-cloud-gaming/.repo-metadata.json * api_shortname field missing from google-cloud-gaming/.repo-metadata.json * must have required property 'release_level' in google-cloud-gke_connect-gateway-v1beta1/.repo-metadata.json * api_shortname field missing from google-cloud-gke_connect-gateway-v1beta1/.repo-metadata.json * must have required property 'release_level' in google-cloud-gke_connect-gateway/.repo-metadata.json * api_shortname field missing from google-cloud-gke_connect-gateway/.repo-metadata.json * must have required property 'release_level' in google-cloud-gke_hub-v1/.repo-metadata.json * api_shortname field missing from google-cloud-gke_hub-v1/.repo-metadata.json * must have required property 'release_level' in google-cloud-gke_hub-v1beta1/.repo-metadata.json * api_shortname field missing from google-cloud-gke_hub-v1beta1/.repo-metadata.json * must have required property 'release_level' in google-cloud-gke_hub/.repo-metadata.json * api_shortname field missing from google-cloud-gke_hub/.repo-metadata.json * must have required property 'release_level' in google-cloud-iap-v1/.repo-metadata.json * api_shortname field missing from google-cloud-iap-v1/.repo-metadata.json * must have required property 'release_level' in google-cloud-iap/.repo-metadata.json * api_shortname field missing from google-cloud-iap/.repo-metadata.json * must have required property 'release_level' in google-cloud-ids-v1/.repo-metadata.json * api_shortname field missing from google-cloud-ids-v1/.repo-metadata.json * must have required property 'release_level' in google-cloud-ids/.repo-metadata.json * api_shortname field missing from google-cloud-ids/.repo-metadata.json * must have required property 'release_level' in google-cloud-iot-v1/.repo-metadata.json * api_shortname field missing from google-cloud-iot-v1/.repo-metadata.json * must have required property 'release_level' in google-cloud-iot/.repo-metadata.json * api_shortname field missing from 
google-cloud-iot/.repo-metadata.json * must have required property 'release_level' in google-cloud-kms-v1/.repo-metadata.json * api_shortname field missing from google-cloud-kms-v1/.repo-metadata.json * must have required property 'release_level' in google-cloud-kms/.repo-metadata.json * api_shortname field missing from google-cloud-kms/.repo-metadata.json * must have required property 'release_level' in google-cloud-language-v1/.repo-metadata.json * api_shortname field missing from google-cloud-language-v1/.repo-metadata.json * must have required property 'release_level' in google-cloud-language-v1beta2/.repo-metadata.json * api_shortname field missing from google-cloud-language-v1beta2/.repo-metadata.json * must have required property 'release_level' in google-cloud-language/.repo-metadata.json * api_shortname field missing from google-cloud-language/.repo-metadata.json * must have required property 'release_level' in google-cloud-life_sciences-v2beta/.repo-metadata.json * api_shortname field missing from google-cloud-life_sciences-v2beta/.repo-metadata.json * must have required property 'release_level' in google-cloud-life_sciences/.repo-metadata.json * api_shortname field missing from google-cloud-life_sciences/.repo-metadata.json * must have required property 'release_level' in google-cloud-location/.repo-metadata.json * api_shortname field missing from google-cloud-location/.repo-metadata.json * must have required property 'release_level' in google-cloud-logging-v2/.repo-metadata.json * api_shortname field missing from google-cloud-logging-v2/.repo-metadata.json * must have required property 'release_level' in google-cloud-logging/.repo-metadata.json * api_shortname field missing from google-cloud-logging/.repo-metadata.json * must have required property 'release_level' in google-cloud-managed_identities-v1/.repo-metadata.json * api_shortname field missing from google-cloud-managed_identities-v1/.repo-metadata.json * must have required property 
'release_level' in google-cloud-managed_identities/.repo-metadata.json * api_shortname field missing from google-cloud-managed_identities/.repo-metadata.json * must have required property 'release_level' in google-cloud-media_translation-v1beta1/.repo-metadata.json * api_shortname field missing from google-cloud-media_translation-v1beta1/.repo-metadata.json * must have required property 'release_level' in google-cloud-media_translation/.repo-metadata.json * api_shortname field missing from google-cloud-media_translation/.repo-metadata.json * must have required property 'release_level' in google-cloud-memcache-v1/.repo-metadata.json * api_shortname field missing from google-cloud-memcache-v1/.repo-metadata.json * must have required property 'release_level' in google-cloud-memcache-v1beta2/.repo-metadata.json * api_shortname field missing from google-cloud-memcache-v1beta2/.repo-metadata.json * must have required property 'release_level' in google-cloud-memcache/.repo-metadata.json * api_shortname field missing from google-cloud-memcache/.repo-metadata.json * must have required property 'release_level' in google-cloud-metastore-v1/.repo-metadata.json * api_shortname field missing from google-cloud-metastore-v1/.repo-metadata.json * must have required property 'release_level' in google-cloud-metastore-v1beta/.repo-metadata.json * api_shortname field missing from google-cloud-metastore-v1beta/.repo-metadata.json * must have required property 'release_level' in google-cloud-metastore/.repo-metadata.json * api_shortname field missing from google-cloud-metastore/.repo-metadata.json * must have required property 'release_level' in google-cloud-monitoring-dashboard-v1/.repo-metadata.json * api_shortname field missing from google-cloud-monitoring-dashboard-v1/.repo-metadata.json * must have required property 'release_level' in google-cloud-monitoring-metrics_scope-v1/.repo-metadata.json * api_shortname field missing from 
google-cloud-monitoring-metrics_scope-v1/.repo-metadata.json * must have required property 'release_level' in google-cloud-monitoring-v3/.repo-metadata.json * api_shortname field missing from google-cloud-monitoring-v3/.repo-metadata.json * must have required property 'release_level' in google-cloud-monitoring/.repo-metadata.json * api_shortname field missing from google-cloud-monitoring/.repo-metadata.json * must have required property 'release_level' in google-cloud-network_connectivity-v1/.repo-metadata.json * api_shortname field missing from google-cloud-network_connectivity-v1/.repo-metadata.json * must have required property 'release_level' in google-cloud-network_connectivity-v1alpha1/.repo-metadata.json * api_shortname field missing from google-cloud-network_connectivity-v1alpha1/.repo-metadata.json * must have required property 'release_level' in google-cloud-network_connectivity/.repo-metadata.json * api_shortname field missing from google-cloud-network_connectivity/.repo-metadata.json * must have required property 'release_level' in google-cloud-network_management-v1/.repo-metadata.json * api_shortname field missing from google-cloud-network_management-v1/.repo-metadata.json * must have required property 'release_level' in google-cloud-network_management/.repo-metadata.json * api_shortname field missing from google-cloud-network_management/.repo-metadata.json * must have required property 'release_level' in google-cloud-network_security-v1beta1/.repo-metadata.json * api_shortname field missing from google-cloud-network_security-v1beta1/.repo-metadata.json * must have required property 'release_level' in google-cloud-network_security/.repo-metadata.json * api_shortname field missing from google-cloud-network_security/.repo-metadata.json * must have required property 'release_level' in google-cloud-notebooks-v1beta1/.repo-metadata.json * api_shortname field missing from google-cloud-notebooks-v1beta1/.repo-metadata.json * must have required property 
'release_level' in google-cloud-notebooks/.repo-metadata.json * api_shortname field missing from google-cloud-notebooks/.repo-metadata.json * must have required property 'release_level' in google-cloud-orchestration-airflow-service-v1/.repo-metadata.json * api_shortname field missing from google-cloud-orchestration-airflow-service-v1/.repo-metadata.json * must have required property 'release_level' in google-cloud-orchestration-airflow-service/.repo-metadata.json * api_shortname field missing from google-cloud-orchestration-airflow-service/.repo-metadata.json * must have required property 'release_level' in google-cloud-org_policy-v2/.repo-metadata.json * api_shortname field missing from google-cloud-org_policy-v2/.repo-metadata.json * must have required property 'release_level' in google-cloud-org_policy/.repo-metadata.json * api_shortname field missing from google-cloud-org_policy/.repo-metadata.json * must have required property 'release_level' in google-cloud-os_config-v1/.repo-metadata.json * api_shortname field missing from google-cloud-os_config-v1/.repo-metadata.json * must have required property 'release_level' in google-cloud-os_config-v1alpha/.repo-metadata.json * api_shortname field missing from google-cloud-os_config-v1alpha/.repo-metadata.json * must have required property 'release_level' in google-cloud-os_config/.repo-metadata.json * api_shortname field missing from google-cloud-os_config/.repo-metadata.json * must have required property 'release_level' in google-cloud-os_login-v1/.repo-metadata.json * api_shortname field missing from google-cloud-os_login-v1/.repo-metadata.json * must have required property 'release_level' in google-cloud-os_login-v1beta/.repo-metadata.json * api_shortname field missing from google-cloud-os_login-v1beta/.repo-metadata.json * must have required property 'release_level' in google-cloud-os_login/.repo-metadata.json * api_shortname field missing from google-cloud-os_login/.repo-metadata.json * must have required 
property 'release_level' in google-cloud-phishing_protection-v1beta1/.repo-metadata.json * api_shortname field missing from google-cloud-phishing_protection-v1beta1/.repo-metadata.json * must have required property 'release_level' in google-cloud-phishing_protection/.repo-metadata.json * api_shortname field missing from google-cloud-phishing_protection/.repo-metadata.json * must have required property 'release_level' in google-cloud-policy_troubleshooter-v1/.repo-metadata.json * api_shortname field missing from google-cloud-policy_troubleshooter-v1/.repo-metadata.json * must have required property 'release_level' in google-cloud-policy_troubleshooter/.repo-metadata.json * api_shortname field missing from google-cloud-policy_troubleshooter/.repo-metadata.json * must have required property 'release_level' in google-cloud-private_catalog-v1beta1/.repo-metadata.json * api_shortname field missing from google-cloud-private_catalog-v1beta1/.repo-metadata.json * must have required property 'release_level' in google-cloud-private_catalog/.repo-metadata.json * api_shortname field missing from google-cloud-private_catalog/.repo-metadata.json * must have required property 'release_level' in google-cloud-profiler-v2/.repo-metadata.json * api_shortname field missing from google-cloud-profiler-v2/.repo-metadata.json * must have required property 'release_level' in google-cloud-profiler/.repo-metadata.json * api_shortname field missing from google-cloud-profiler/.repo-metadata.json * must have required property 'release_level' in google-cloud-pubsub-v1/.repo-metadata.json * api_shortname field missing from google-cloud-pubsub-v1/.repo-metadata.json * must have required property 'release_level' in google-cloud-pubsub/.repo-metadata.json * api_shortname field missing from google-cloud-pubsub/.repo-metadata.json * must have required property 'release_level' in google-cloud-recaptcha_enterprise-v1/.repo-metadata.json * api_shortname field missing from 
google-cloud-recaptcha_enterprise-v1/.repo-metadata.json * must have required property 'release_level' in google-cloud-recaptcha_enterprise-v1beta1/.repo-metadata.json * api_shortname field missing from google-cloud-recaptcha_enterprise-v1beta1/.repo-metadata.json * must have required property 'release_level' in google-cloud-recaptcha_enterprise/.repo-metadata.json * api_shortname field missing from google-cloud-recaptcha_enterprise/.repo-metadata.json * must have required property 'release_level' in google-cloud-recommendation_engine-v1beta1/.repo-metadata.json * api_shortname field missing from google-cloud-recommendation_engine-v1beta1/.repo-metadata.json * must have required property 'release_level' in google-cloud-recommendation_engine/.repo-metadata.json * api_shortname field missing from google-cloud-recommendation_engine/.repo-metadata.json * must have required property 'release_level' in google-cloud-recommender-v1/.repo-metadata.json * api_shortname field missing from google-cloud-recommender-v1/.repo-metadata.json * must have required property 'release_level' in google-cloud-recommender/.repo-metadata.json * api_shortname field missing from google-cloud-recommender/.repo-metadata.json * must have required property 'release_level' in google-cloud-redis-v1/.repo-metadata.json * api_shortname field missing from google-cloud-redis-v1/.repo-metadata.json * must have required property 'release_level' in google-cloud-redis-v1beta1/.repo-metadata.json * api_shortname field missing from google-cloud-redis-v1beta1/.repo-metadata.json * must have required property 'release_level' in google-cloud-redis/.repo-metadata.json * api_shortname field missing from google-cloud-redis/.repo-metadata.json * must have required property 'release_level' in google-cloud-resource_manager-v3/.repo-metadata.json * api_shortname field missing from google-cloud-resource_manager-v3/.repo-metadata.json * must have required property 'release_level' in 
google-cloud-resource_manager/.repo-metadata.json * api_shortname field missing from google-cloud-resource_manager/.repo-metadata.json * must have required property 'release_level' in google-cloud-resource_settings-v1/.repo-metadata.json * api_shortname field missing from google-cloud-resource_settings-v1/.repo-metadata.json * must have required property 'release_level' in google-cloud-resource_settings/.repo-metadata.json * api_shortname field missing from google-cloud-resource_settings/.repo-metadata.json * must have required property 'release_level' in google-cloud-retail-v2/.repo-metadata.json * api_shortname field missing from google-cloud-retail-v2/.repo-metadata.json * must have required property 'release_level' in google-cloud-retail/.repo-metadata.json * api_shortname field missing from google-cloud-retail/.repo-metadata.json * must have required property 'release_level' in google-cloud-scheduler-v1/.repo-metadata.json * api_shortname field missing from google-cloud-scheduler-v1/.repo-metadata.json * must have required property 'release_level' in google-cloud-scheduler-v1beta1/.repo-metadata.json * api_shortname field missing from google-cloud-scheduler-v1beta1/.repo-metadata.json * must have required property 'release_level' in google-cloud-scheduler/.repo-metadata.json * api_shortname field missing from google-cloud-scheduler/.repo-metadata.json * must have required property 'release_level' in google-cloud-secret_manager-v1/.repo-metadata.json * api_shortname field missing from google-cloud-secret_manager-v1/.repo-metadata.json * must have required property 'release_level' in google-cloud-secret_manager-v1beta1/.repo-metadata.json * api_shortname field missing from google-cloud-secret_manager-v1beta1/.repo-metadata.json * must have required property 'release_level' in google-cloud-secret_manager/.repo-metadata.json * api_shortname field missing from google-cloud-secret_manager/.repo-metadata.json * must have required property 'release_level' in 
google-cloud-security-private_ca-v1/.repo-metadata.json * api_shortname field missing from google-cloud-security-private_ca-v1/.repo-metadata.json * must have required property 'release_level' in google-cloud-security-private_ca-v1beta1/.repo-metadata.json * api_shortname field missing from google-cloud-security-private_ca-v1beta1/.repo-metadata.json * must have required property 'release_level' in google-cloud-security-private_ca/.repo-metadata.json * api_shortname field missing from google-cloud-security-private_ca/.repo-metadata.json * must have required property 'release_level' in google-cloud-security_center-v1/.repo-metadata.json * api_shortname field missing from google-cloud-security_center-v1/.repo-metadata.json * must have required property 'release_level' in google-cloud-security_center-v1p1beta1/.repo-metadata.json * api_shortname field missing from google-cloud-security_center-v1p1beta1/.repo-metadata.json * must have required property 'release_level' in google-cloud-security_center/.repo-metadata.json * api_shortname field missing from google-cloud-security_center/.repo-metadata.json * must have required property 'release_level' in google-cloud-service_control-v1/.repo-metadata.json * api_shortname field missing from google-cloud-service_control-v1/.repo-metadata.json * must have required property 'release_level' in google-cloud-service_control/.repo-metadata.json * api_shortname field missing from google-cloud-service_control/.repo-metadata.json * must have required property 'release_level' in google-cloud-service_directory-v1/.repo-metadata.json * api_shortname field missing from google-cloud-service_directory-v1/.repo-metadata.json * must have required property 'release_level' in google-cloud-service_directory-v1beta1/.repo-metadata.json * api_shortname field missing from google-cloud-service_directory-v1beta1/.repo-metadata.json * must have required property 'release_level' in google-cloud-service_directory/.repo-metadata.json * api_shortname 
field missing from google-cloud-service_directory/.repo-metadata.json * must have required property 'release_level' in google-cloud-service_management-v1/.repo-metadata.json * api_shortname field missing from google-cloud-service_management-v1/.repo-metadata.json * must have required property 'release_level' in google-cloud-service_management/.repo-metadata.json * api_shortname field missing from google-cloud-service_management/.repo-metadata.json * must have required property 'release_level' in google-cloud-service_usage-v1/.repo-metadata.json * api_shortname field missing from google-cloud-service_usage-v1/.repo-metadata.json * must have required property 'release_level' in google-cloud-service_usage/.repo-metadata.json * api_shortname field missing from google-cloud-service_usage/.repo-metadata.json * must have required property 'release_level' in google-cloud-shell-v1/.repo-metadata.json * api_shortname field missing from google-cloud-shell-v1/.repo-metadata.json * must have required property 'release_level' in google-cloud-shell/.repo-metadata.json * api_shortname field missing from google-cloud-shell/.repo-metadata.json * must have required property 'release_level' in google-cloud-spanner-admin-database-v1/.repo-metadata.json * api_shortname field missing from google-cloud-spanner-admin-database-v1/.repo-metadata.json * must have required property 'release_level' in google-cloud-spanner-admin-instance-v1/.repo-metadata.json * api_shortname field missing from google-cloud-spanner-admin-instance-v1/.repo-metadata.json * must have required property 'release_level' in google-cloud-spanner-v1/.repo-metadata.json * api_shortname field missing from google-cloud-spanner-v1/.repo-metadata.json * must have required property 'release_level' in google-cloud-spanner/.repo-metadata.json * api_shortname field missing from google-cloud-spanner/.repo-metadata.json * must have required property 'release_level' in google-cloud-speech-v1/.repo-metadata.json * api_shortname field 
missing from google-cloud-speech-v1/.repo-metadata.json * must have required property 'release_level' in google-cloud-speech-v1p1beta1/.repo-metadata.json * api_shortname field missing from google-cloud-speech-v1p1beta1/.repo-metadata.json * must have required property 'release_level' in google-cloud-speech/.repo-metadata.json * api_shortname field missing from google-cloud-speech/.repo-metadata.json * must have required property 'release_level' in google-cloud-storage/.repo-metadata.json * api_shortname field missing from google-cloud-storage/.repo-metadata.json * must have required property 'release_level' in google-cloud-storage_transfer-v1/.repo-metadata.json * api_shortname field missing from google-cloud-storage_transfer-v1/.repo-metadata.json * must have required property 'release_level' in google-cloud-storage_transfer/.repo-metadata.json * api_shortname field missing from google-cloud-storage_transfer/.repo-metadata.json * must have required property 'release_level' in google-cloud-talent-v4/.repo-metadata.json * api_shortname field missing from google-cloud-talent-v4/.repo-metadata.json * must have required property 'release_level' in google-cloud-talent-v4beta1/.repo-metadata.json * api_shortname field missing from google-cloud-talent-v4beta1/.repo-metadata.json * must have required property 'release_level' in google-cloud-talent/.repo-metadata.json * api_shortname field missing from google-cloud-talent/.repo-metadata.json * must have required property 'release_level' in google-cloud-tasks-v2/.repo-metadata.json * api_shortname field missing from google-cloud-tasks-v2/.repo-metadata.json * must have required property 'release_level' in google-cloud-tasks-v2beta2/.repo-metadata.json * api_shortname field missing from google-cloud-tasks-v2beta2/.repo-metadata.json * must have required property 'release_level' in google-cloud-tasks-v2beta3/.repo-metadata.json * api_shortname field missing from google-cloud-tasks-v2beta3/.repo-metadata.json * must have 
required property 'release_level' in google-cloud-tasks/.repo-metadata.json * api_shortname field missing from google-cloud-tasks/.repo-metadata.json * must have required property 'release_level' in google-cloud-text_to_speech-v1/.repo-metadata.json * api_shortname field missing from google-cloud-text_to_speech-v1/.repo-metadata.json * must have required property 'release_level' in google-cloud-text_to_speech-v1beta1/.repo-metadata.json * api_shortname field missing from google-cloud-text_to_speech-v1beta1/.repo-metadata.json * must have required property 'release_level' in google-cloud-text_to_speech/.repo-metadata.json * api_shortname field missing from google-cloud-text_to_speech/.repo-metadata.json * must have required property 'release_level' in google-cloud-tpu-v1/.repo-metadata.json * api_shortname field missing from google-cloud-tpu-v1/.repo-metadata.json * must have required property 'release_level' in google-cloud-tpu/.repo-metadata.json * api_shortname field missing from google-cloud-tpu/.repo-metadata.json * must have required property 'release_level' in google-cloud-trace-v1/.repo-metadata.json * api_shortname field missing from google-cloud-trace-v1/.repo-metadata.json * must have required property 'release_level' in google-cloud-trace-v2/.repo-metadata.json * api_shortname field missing from google-cloud-trace-v2/.repo-metadata.json * must have required property 'release_level' in google-cloud-trace/.repo-metadata.json * must have required property 'release_level' in google-cloud-translate-v2/.repo-metadata.json * api_shortname field missing from google-cloud-translate-v2/.repo-metadata.json * must have required property 'release_level' in google-cloud-translate-v3/.repo-metadata.json * api_shortname field missing from google-cloud-translate-v3/.repo-metadata.json * must have required property 'release_level' in google-cloud-translate/.repo-metadata.json * api_shortname field missing from google-cloud-translate/.repo-metadata.json * must have 
required property 'release_level' in google-cloud-video-transcoder-v1/.repo-metadata.json * api_shortname field missing from google-cloud-video-transcoder-v1/.repo-metadata.json * must have required property 'release_level' in google-cloud-video-transcoder-v1beta1/.repo-metadata.json * api_shortname field missing from google-cloud-video-transcoder-v1beta1/.repo-metadata.json * must have required property 'release_level' in google-cloud-video-transcoder/.repo-metadata.json * api_shortname field missing from google-cloud-video-transcoder/.repo-metadata.json * must have required property 'release_level' in google-cloud-video_intelligence-v1/.repo-metadata.json * api_shortname field missing from google-cloud-video_intelligence-v1/.repo-metadata.json * must have required property 'release_level' in google-cloud-video_intelligence-v1beta2/.repo-metadata.json * api_shortname field missing from google-cloud-video_intelligence-v1beta2/.repo-metadata.json * must have required property 'release_level' in google-cloud-video_intelligence-v1p1beta1/.repo-metadata.json * api_shortname field missing from google-cloud-video_intelligence-v1p1beta1/.repo-metadata.json * must have required property 'release_level' in google-cloud-video_intelligence-v1p2beta1/.repo-metadata.json * api_shortname field missing from google-cloud-video_intelligence-v1p2beta1/.repo-metadata.json * must have required property 'release_level' in google-cloud-video_intelligence-v1p3beta1/.repo-metadata.json * api_shortname field missing from google-cloud-video_intelligence-v1p3beta1/.repo-metadata.json * must have required property 'release_level' in google-cloud-video_intelligence/.repo-metadata.json * api_shortname field missing from google-cloud-video_intelligence/.repo-metadata.json * must have required property 'release_level' in google-cloud-vision-v1/.repo-metadata.json * api_shortname field missing from google-cloud-vision-v1/.repo-metadata.json * must have required property 'release_level' in 
google-cloud-vision-v1p3beta1/.repo-metadata.json * api_shortname field missing from google-cloud-vision-v1p3beta1/.repo-metadata.json * must have required property 'release_level' in google-cloud-vision-v1p4beta1/.repo-metadata.json * api_shortname field missing from google-cloud-vision-v1p4beta1/.repo-metadata.json * must have required property 'release_level' in google-cloud-vision/.repo-metadata.json * api_shortname field missing from google-cloud-vision/.repo-metadata.json * must have required property 'release_level' in google-cloud-vm_migration-v1/.repo-metadata.json * api_shortname field missing from google-cloud-vm_migration-v1/.repo-metadata.json * must have required property 'release_level' in google-cloud-vm_migration/.repo-metadata.json * api_shortname field missing from google-cloud-vm_migration/.repo-metadata.json * must have required property 'release_level' in google-cloud-vpc_access-v1/.repo-metadata.json * api_shortname field missing from google-cloud-vpc_access-v1/.repo-metadata.json * must have required property 'release_level' in google-cloud-vpc_access/.repo-metadata.json * api_shortname field missing from google-cloud-vpc_access/.repo-metadata.json * must have required property 'release_level' in google-cloud-web_risk-v1/.repo-metadata.json * api_shortname field missing from google-cloud-web_risk-v1/.repo-metadata.json * must have required property 'release_level' in google-cloud-web_risk-v1beta1/.repo-metadata.json * api_shortname field missing from google-cloud-web_risk-v1beta1/.repo-metadata.json * must have required property 'release_level' in google-cloud-web_risk/.repo-metadata.json * api_shortname field missing from google-cloud-web_risk/.repo-metadata.json * must have required property 'release_level' in google-cloud-web_security_scanner-v1/.repo-metadata.json * api_shortname field missing from google-cloud-web_security_scanner-v1/.repo-metadata.json * must have required property 'release_level' in 
google-cloud-web_security_scanner-v1beta/.repo-metadata.json * api_shortname field missing from google-cloud-web_security_scanner-v1beta/.repo-metadata.json * must have required property 'release_level' in google-cloud-web_security_scanner/.repo-metadata.json * api_shortname field missing from google-cloud-web_security_scanner/.repo-metadata.json * must have required property 'release_level' in google-cloud-webrisk/.repo-metadata.json * api_shortname field missing from google-cloud-webrisk/.repo-metadata.json * must have required property 'release_level' in google-cloud-workflows-executions-v1/.repo-metadata.json * api_shortname field missing from google-cloud-workflows-executions-v1/.repo-metadata.json * must have required property 'release_level' in google-cloud-workflows-executions-v1beta/.repo-metadata.json * api_shortname field missing from google-cloud-workflows-executions-v1beta/.repo-metadata.json * must have required property 'release_level' in google-cloud-workflows-v1/.repo-metadata.json * api_shortname field missing from google-cloud-workflows-v1/.repo-metadata.json * must have required property 'release_level' in google-cloud-workflows-v1beta/.repo-metadata.json * api_shortname field missing from google-cloud-workflows-v1beta/.repo-metadata.json * must have required property 'release_level' in google-cloud-workflows/.repo-metadata.json * api_shortname field missing from google-cloud-workflows/.repo-metadata.json * must have required property 'library_type' in google-cloud/.repo-metadata.json * must have required property 'release_level' in google-cloud/.repo-metadata.json * must have required property 'release_level' in google-iam-credentials-v1/.repo-metadata.json * api_shortname field missing from google-iam-credentials-v1/.repo-metadata.json * must have required property 'release_level' in google-iam-credentials/.repo-metadata.json * api_shortname field missing from google-iam-credentials/.repo-metadata.json * must have required property 
'release_level' in google-iam-v1beta/.repo-metadata.json * api_shortname field missing from google-iam-v1beta/.repo-metadata.json * must have required property 'release_level' in google-identity-access_context_manager-v1/.repo-metadata.json * api_shortname field missing from google-identity-access_context_manager-v1/.repo-metadata.json * must have required property 'release_level' in google-identity-access_context_manager/.repo-metadata.json * api_shortname field missing from google-identity-access_context_manager/.repo-metadata.json * must have required property 'library_type' in grafeas-client/.repo-metadata.json * must have required property 'release_level' in grafeas-client/.repo-metadata.json * must have required property 'release_level' in grafeas-v1/.repo-metadata.json * api_shortname field missing from grafeas-v1/.repo-metadata.json * must have required property 'release_level' in grafeas/.repo-metadata.json * api_shortname field missing from grafeas/.repo-metadata.json * must have required property 'library_type' in stackdriver-core/.repo-metadata.json * must have required property 'release_level' in stackdriver-core/.repo-metadata.json * must have required property 'library_type' in stackdriver/.repo-metadata.json * must have required property 'release_level' in stackdriver/.repo-metadata.json ☝️ Once you correct these problems, you can close this issue. Reach out to **go/github-automation** if you have any questions.
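For reference, a minimal sketch of a `.repo-metadata.json` entry that would satisfy the three checks reported above (missing `release_level`, `library_type`, and `api_shortname`). The field values shown are illustrative assumptions — the set of allowed values is defined by the repository's metadata schema, not by this issue:

```json
{
  "name": "google-cloud-kms",
  "api_shortname": "cloudkms",
  "release_level": "stable",
  "library_type": "GAPIC_AUTO"
}
```

Each flagged `.repo-metadata.json` would need the missing keys added with values valid for that particular library before the scan passes and the issue can be closed.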
google-cloud-bigquery-reservation-v1/.repo-metadata.json * api_shortname field missing from google-cloud-bigquery-reservation-v1/.repo-metadata.json * must have required property 'release_level' in google-cloud-bigquery-reservation/.repo-metadata.json * api_shortname field missing from google-cloud-bigquery-reservation/.repo-metadata.json * must have required property 'release_level' in google-cloud-bigquery-storage-v1/.repo-metadata.json * api_shortname field missing from google-cloud-bigquery-storage-v1/.repo-metadata.json * must have required property 'release_level' in google-cloud-bigquery-storage/.repo-metadata.json * api_shortname field missing from google-cloud-bigquery-storage/.repo-metadata.json * must have required property 'release_level' in google-cloud-bigquery/.repo-metadata.json * api_shortname field missing from google-cloud-bigquery/.repo-metadata.json * must have required property 'release_level' in google-cloud-bigtable-admin-v2/.repo-metadata.json * api_shortname field missing from google-cloud-bigtable-admin-v2/.repo-metadata.json * must have required property 'release_level' in google-cloud-bigtable-v2/.repo-metadata.json * api_shortname field missing from google-cloud-bigtable-v2/.repo-metadata.json * must have required property 'release_level' in google-cloud-bigtable/.repo-metadata.json * api_shortname field missing from google-cloud-bigtable/.repo-metadata.json * must have required property 'release_level' in google-cloud-billing-budgets-v1/.repo-metadata.json * api_shortname field missing from google-cloud-billing-budgets-v1/.repo-metadata.json * must have required property 'release_level' in google-cloud-billing-budgets-v1beta1/.repo-metadata.json * api_shortname field missing from google-cloud-billing-budgets-v1beta1/.repo-metadata.json * must have required property 'release_level' in google-cloud-billing-budgets/.repo-metadata.json * api_shortname field missing from google-cloud-billing-budgets/.repo-metadata.json * must have required 
property 'release_level' in google-cloud-billing-v1/.repo-metadata.json * api_shortname field missing from google-cloud-billing-v1/.repo-metadata.json * must have required property 'release_level' in google-cloud-billing/.repo-metadata.json * api_shortname field missing from google-cloud-billing/.repo-metadata.json * must have required property 'release_level' in google-cloud-binary_authorization-v1/.repo-metadata.json * api_shortname field missing from google-cloud-binary_authorization-v1/.repo-metadata.json * must have required property 'release_level' in google-cloud-binary_authorization-v1beta1/.repo-metadata.json * api_shortname field missing from google-cloud-binary_authorization-v1beta1/.repo-metadata.json * must have required property 'release_level' in google-cloud-binary_authorization/.repo-metadata.json * api_shortname field missing from google-cloud-binary_authorization/.repo-metadata.json * must have required property 'release_level' in google-cloud-build-v1/.repo-metadata.json * api_shortname field missing from google-cloud-build-v1/.repo-metadata.json * must have required property 'release_level' in google-cloud-build/.repo-metadata.json * api_shortname field missing from google-cloud-build/.repo-metadata.json * must have required property 'release_level' in google-cloud-channel-v1/.repo-metadata.json * api_shortname field missing from google-cloud-channel-v1/.repo-metadata.json * must have required property 'release_level' in google-cloud-channel/.repo-metadata.json * api_shortname field missing from google-cloud-channel/.repo-metadata.json * must have required property 'release_level' in google-cloud-cloud_dms-v1/.repo-metadata.json * api_shortname field missing from google-cloud-cloud_dms-v1/.repo-metadata.json * must have required property 'release_level' in google-cloud-cloud_dms/.repo-metadata.json * api_shortname field missing from google-cloud-cloud_dms/.repo-metadata.json * must have required property 'release_level' in 
google-cloud-compute-v1/.repo-metadata.json * api_shortname field missing from google-cloud-compute-v1/.repo-metadata.json * must have required property 'release_level' in google-cloud-contact_center_insights-v1/.repo-metadata.json * api_shortname field missing from google-cloud-contact_center_insights-v1/.repo-metadata.json * must have required property 'release_level' in google-cloud-contact_center_insights/.repo-metadata.json * api_shortname field missing from google-cloud-contact_center_insights/.repo-metadata.json * must have required property 'release_level' in google-cloud-container-v1/.repo-metadata.json * api_shortname field missing from google-cloud-container-v1/.repo-metadata.json * must have required property 'release_level' in google-cloud-container-v1beta1/.repo-metadata.json * api_shortname field missing from google-cloud-container-v1beta1/.repo-metadata.json * must have required property 'release_level' in google-cloud-container/.repo-metadata.json * api_shortname field missing from google-cloud-container/.repo-metadata.json * must have required property 'release_level' in google-cloud-container_analysis-v1/.repo-metadata.json * api_shortname field missing from google-cloud-container_analysis-v1/.repo-metadata.json * must have required property 'release_level' in google-cloud-container_analysis/.repo-metadata.json * api_shortname field missing from google-cloud-container_analysis/.repo-metadata.json * must have required property 'release_level' in google-cloud-core/.repo-metadata.json * must have required property 'release_level' in google-cloud-data_catalog-v1/.repo-metadata.json * api_shortname field missing from google-cloud-data_catalog-v1/.repo-metadata.json * must have required property 'release_level' in google-cloud-data_catalog/.repo-metadata.json * api_shortname field missing from google-cloud-data_catalog/.repo-metadata.json * must have required property 'release_level' in google-cloud-data_fusion-v1/.repo-metadata.json * api_shortname 
field missing from google-cloud-data_fusion-v1/.repo-metadata.json * must have required property 'release_level' in google-cloud-data_fusion/.repo-metadata.json * api_shortname field missing from google-cloud-data_fusion/.repo-metadata.json * must have required property 'release_level' in google-cloud-data_labeling-v1beta1/.repo-metadata.json * api_shortname field missing from google-cloud-data_labeling-v1beta1/.repo-metadata.json * must have required property 'release_level' in google-cloud-data_labeling/.repo-metadata.json * api_shortname field missing from google-cloud-data_labeling/.repo-metadata.json * must have required property 'release_level' in google-cloud-dataflow-v1beta3/.repo-metadata.json * api_shortname field missing from google-cloud-dataflow-v1beta3/.repo-metadata.json * must have required property 'release_level' in google-cloud-dataflow/.repo-metadata.json * api_shortname field missing from google-cloud-dataflow/.repo-metadata.json * must have required property 'release_level' in google-cloud-dataproc-v1/.repo-metadata.json * api_shortname field missing from google-cloud-dataproc-v1/.repo-metadata.json * must have required property 'release_level' in google-cloud-dataproc-v1beta2/.repo-metadata.json * api_shortname field missing from google-cloud-dataproc-v1beta2/.repo-metadata.json * must have required property 'release_level' in google-cloud-dataproc/.repo-metadata.json * api_shortname field missing from google-cloud-dataproc/.repo-metadata.json * must have required property 'release_level' in google-cloud-dataqna-v1alpha/.repo-metadata.json * api_shortname field missing from google-cloud-dataqna-v1alpha/.repo-metadata.json * must have required property 'release_level' in google-cloud-dataqna/.repo-metadata.json * api_shortname field missing from google-cloud-dataqna/.repo-metadata.json * must have required property 'release_level' in google-cloud-datastore-admin-v1/.repo-metadata.json * api_shortname field missing from 
google-cloud-datastore-admin-v1/.repo-metadata.json * must have required property 'release_level' in google-cloud-datastore-v1/.repo-metadata.json * api_shortname field missing from google-cloud-datastore-v1/.repo-metadata.json * must have required property 'release_level' in google-cloud-datastore/.repo-metadata.json * api_shortname field missing from google-cloud-datastore/.repo-metadata.json * must have required property 'release_level' in google-cloud-datastream-v1alpha1/.repo-metadata.json * api_shortname field missing from google-cloud-datastream-v1alpha1/.repo-metadata.json * must have required property 'release_level' in google-cloud-datastream/.repo-metadata.json * api_shortname field missing from google-cloud-datastream/.repo-metadata.json * must have required property 'release_level' in google-cloud-debugger-v2/.repo-metadata.json * api_shortname field missing from google-cloud-debugger-v2/.repo-metadata.json * must have required property 'release_level' in google-cloud-debugger/.repo-metadata.json * must have required property 'release_level' in google-cloud-deploy-v1/.repo-metadata.json * api_shortname field missing from google-cloud-deploy-v1/.repo-metadata.json * must have required property 'release_level' in google-cloud-deploy/.repo-metadata.json * api_shortname field missing from google-cloud-deploy/.repo-metadata.json * must have required property 'release_level' in google-cloud-dialogflow-cx-v3/.repo-metadata.json * api_shortname field missing from google-cloud-dialogflow-cx-v3/.repo-metadata.json * must have required property 'release_level' in google-cloud-dialogflow-cx/.repo-metadata.json * api_shortname field missing from google-cloud-dialogflow-cx/.repo-metadata.json * must have required property 'release_level' in google-cloud-dialogflow-v2/.repo-metadata.json * api_shortname field missing from google-cloud-dialogflow-v2/.repo-metadata.json * must have required property 'release_level' in google-cloud-dialogflow/.repo-metadata.json * 
api_shortname field missing from google-cloud-dialogflow/.repo-metadata.json * must have required property 'release_level' in google-cloud-dlp-v2/.repo-metadata.json * api_shortname field missing from google-cloud-dlp-v2/.repo-metadata.json * must have required property 'release_level' in google-cloud-dlp/.repo-metadata.json * api_shortname field missing from google-cloud-dlp/.repo-metadata.json * must have required property 'release_level' in google-cloud-dns/.repo-metadata.json * api_shortname field missing from google-cloud-dns/.repo-metadata.json * must have required property 'release_level' in google-cloud-document_ai-v1/.repo-metadata.json * api_shortname field missing from google-cloud-document_ai-v1/.repo-metadata.json * must have required property 'release_level' in google-cloud-document_ai-v1beta3/.repo-metadata.json * api_shortname field missing from google-cloud-document_ai-v1beta3/.repo-metadata.json * must have required property 'release_level' in google-cloud-document_ai/.repo-metadata.json * api_shortname field missing from google-cloud-document_ai/.repo-metadata.json * must have required property 'release_level' in google-cloud-domains-v1beta1/.repo-metadata.json * api_shortname field missing from google-cloud-domains-v1beta1/.repo-metadata.json * must have required property 'release_level' in google-cloud-domains/.repo-metadata.json * api_shortname field missing from google-cloud-domains/.repo-metadata.json * must have required property 'release_level' in google-cloud-error_reporting-v1beta1/.repo-metadata.json * api_shortname field missing from google-cloud-error_reporting-v1beta1/.repo-metadata.json * must have required property 'release_level' in google-cloud-error_reporting/.repo-metadata.json * must have required property 'release_level' in google-cloud-errors/.repo-metadata.json * must have required property 'release_level' in google-cloud-essential_contacts-v1/.repo-metadata.json * api_shortname field missing from 
google-cloud-essential_contacts-v1/.repo-metadata.json * must have required property 'release_level' in google-cloud-essential_contacts/.repo-metadata.json * api_shortname field missing from google-cloud-essential_contacts/.repo-metadata.json * must have required property 'release_level' in google-cloud-eventarc-v1/.repo-metadata.json * api_shortname field missing from google-cloud-eventarc-v1/.repo-metadata.json * must have required property 'release_level' in google-cloud-eventarc/.repo-metadata.json * api_shortname field missing from google-cloud-eventarc/.repo-metadata.json * must have required property 'release_level' in google-cloud-filestore-v1/.repo-metadata.json * api_shortname field missing from google-cloud-filestore-v1/.repo-metadata.json * must have required property 'release_level' in google-cloud-filestore/.repo-metadata.json * api_shortname field missing from google-cloud-filestore/.repo-metadata.json * must have required property 'release_level' in google-cloud-firestore-admin-v1/.repo-metadata.json * api_shortname field missing from google-cloud-firestore-admin-v1/.repo-metadata.json * must have required property 'release_level' in google-cloud-firestore-v1/.repo-metadata.json * api_shortname field missing from google-cloud-firestore-v1/.repo-metadata.json * must have required property 'release_level' in google-cloud-firestore/.repo-metadata.json * api_shortname field missing from google-cloud-firestore/.repo-metadata.json * must have required property 'release_level' in google-cloud-functions-v1/.repo-metadata.json * api_shortname field missing from google-cloud-functions-v1/.repo-metadata.json * must have required property 'release_level' in google-cloud-functions/.repo-metadata.json * api_shortname field missing from google-cloud-functions/.repo-metadata.json * must have required property 'release_level' in google-cloud-gaming-v1/.repo-metadata.json * api_shortname field missing from google-cloud-gaming-v1/.repo-metadata.json * must have 
required property 'release_level' in google-cloud-gaming/.repo-metadata.json * api_shortname field missing from google-cloud-gaming/.repo-metadata.json * must have required property 'release_level' in google-cloud-gke_connect-gateway-v1beta1/.repo-metadata.json * api_shortname field missing from google-cloud-gke_connect-gateway-v1beta1/.repo-metadata.json * must have required property 'release_level' in google-cloud-gke_connect-gateway/.repo-metadata.json * api_shortname field missing from google-cloud-gke_connect-gateway/.repo-metadata.json * must have required property 'release_level' in google-cloud-gke_hub-v1/.repo-metadata.json * api_shortname field missing from google-cloud-gke_hub-v1/.repo-metadata.json * must have required property 'release_level' in google-cloud-gke_hub-v1beta1/.repo-metadata.json * api_shortname field missing from google-cloud-gke_hub-v1beta1/.repo-metadata.json * must have required property 'release_level' in google-cloud-gke_hub/.repo-metadata.json * api_shortname field missing from google-cloud-gke_hub/.repo-metadata.json * must have required property 'release_level' in google-cloud-iap-v1/.repo-metadata.json * api_shortname field missing from google-cloud-iap-v1/.repo-metadata.json * must have required property 'release_level' in google-cloud-iap/.repo-metadata.json * api_shortname field missing from google-cloud-iap/.repo-metadata.json * must have required property 'release_level' in google-cloud-ids-v1/.repo-metadata.json * api_shortname field missing from google-cloud-ids-v1/.repo-metadata.json * must have required property 'release_level' in google-cloud-ids/.repo-metadata.json * api_shortname field missing from google-cloud-ids/.repo-metadata.json * must have required property 'release_level' in google-cloud-iot-v1/.repo-metadata.json * api_shortname field missing from google-cloud-iot-v1/.repo-metadata.json * must have required property 'release_level' in google-cloud-iot/.repo-metadata.json * api_shortname field missing from 
google-cloud-iot/.repo-metadata.json * must have required property 'release_level' in google-cloud-kms-v1/.repo-metadata.json * api_shortname field missing from google-cloud-kms-v1/.repo-metadata.json * must have required property 'release_level' in google-cloud-kms/.repo-metadata.json * api_shortname field missing from google-cloud-kms/.repo-metadata.json * must have required property 'release_level' in google-cloud-language-v1/.repo-metadata.json * api_shortname field missing from google-cloud-language-v1/.repo-metadata.json * must have required property 'release_level' in google-cloud-language-v1beta2/.repo-metadata.json * api_shortname field missing from google-cloud-language-v1beta2/.repo-metadata.json * must have required property 'release_level' in google-cloud-language/.repo-metadata.json * api_shortname field missing from google-cloud-language/.repo-metadata.json * must have required property 'release_level' in google-cloud-life_sciences-v2beta/.repo-metadata.json * api_shortname field missing from google-cloud-life_sciences-v2beta/.repo-metadata.json * must have required property 'release_level' in google-cloud-life_sciences/.repo-metadata.json * api_shortname field missing from google-cloud-life_sciences/.repo-metadata.json * must have required property 'release_level' in google-cloud-location/.repo-metadata.json * api_shortname field missing from google-cloud-location/.repo-metadata.json * must have required property 'release_level' in google-cloud-logging-v2/.repo-metadata.json * api_shortname field missing from google-cloud-logging-v2/.repo-metadata.json * must have required property 'release_level' in google-cloud-logging/.repo-metadata.json * api_shortname field missing from google-cloud-logging/.repo-metadata.json * must have required property 'release_level' in google-cloud-managed_identities-v1/.repo-metadata.json * api_shortname field missing from google-cloud-managed_identities-v1/.repo-metadata.json * must have required property 
'release_level' in google-cloud-managed_identities/.repo-metadata.json * api_shortname field missing from google-cloud-managed_identities/.repo-metadata.json * must have required property 'release_level' in google-cloud-media_translation-v1beta1/.repo-metadata.json * api_shortname field missing from google-cloud-media_translation-v1beta1/.repo-metadata.json * must have required property 'release_level' in google-cloud-media_translation/.repo-metadata.json * api_shortname field missing from google-cloud-media_translation/.repo-metadata.json * must have required property 'release_level' in google-cloud-memcache-v1/.repo-metadata.json * api_shortname field missing from google-cloud-memcache-v1/.repo-metadata.json * must have required property 'release_level' in google-cloud-memcache-v1beta2/.repo-metadata.json * api_shortname field missing from google-cloud-memcache-v1beta2/.repo-metadata.json * must have required property 'release_level' in google-cloud-memcache/.repo-metadata.json * api_shortname field missing from google-cloud-memcache/.repo-metadata.json * must have required property 'release_level' in google-cloud-metastore-v1/.repo-metadata.json * api_shortname field missing from google-cloud-metastore-v1/.repo-metadata.json * must have required property 'release_level' in google-cloud-metastore-v1beta/.repo-metadata.json * api_shortname field missing from google-cloud-metastore-v1beta/.repo-metadata.json * must have required property 'release_level' in google-cloud-metastore/.repo-metadata.json * api_shortname field missing from google-cloud-metastore/.repo-metadata.json * must have required property 'release_level' in google-cloud-monitoring-dashboard-v1/.repo-metadata.json * api_shortname field missing from google-cloud-monitoring-dashboard-v1/.repo-metadata.json * must have required property 'release_level' in google-cloud-monitoring-metrics_scope-v1/.repo-metadata.json * api_shortname field missing from 
google-cloud-monitoring-metrics_scope-v1/.repo-metadata.json * must have required property 'release_level' in google-cloud-monitoring-v3/.repo-metadata.json * api_shortname field missing from google-cloud-monitoring-v3/.repo-metadata.json * must have required property 'release_level' in google-cloud-monitoring/.repo-metadata.json * api_shortname field missing from google-cloud-monitoring/.repo-metadata.json * must have required property 'release_level' in google-cloud-network_connectivity-v1/.repo-metadata.json * api_shortname field missing from google-cloud-network_connectivity-v1/.repo-metadata.json * must have required property 'release_level' in google-cloud-network_connectivity-v1alpha1/.repo-metadata.json * api_shortname field missing from google-cloud-network_connectivity-v1alpha1/.repo-metadata.json * must have required property 'release_level' in google-cloud-network_connectivity/.repo-metadata.json * api_shortname field missing from google-cloud-network_connectivity/.repo-metadata.json * must have required property 'release_level' in google-cloud-network_management-v1/.repo-metadata.json * api_shortname field missing from google-cloud-network_management-v1/.repo-metadata.json * must have required property 'release_level' in google-cloud-network_management/.repo-metadata.json * api_shortname field missing from google-cloud-network_management/.repo-metadata.json * must have required property 'release_level' in google-cloud-network_security-v1beta1/.repo-metadata.json * api_shortname field missing from google-cloud-network_security-v1beta1/.repo-metadata.json * must have required property 'release_level' in google-cloud-network_security/.repo-metadata.json * api_shortname field missing from google-cloud-network_security/.repo-metadata.json * must have required property 'release_level' in google-cloud-notebooks-v1beta1/.repo-metadata.json * api_shortname field missing from google-cloud-notebooks-v1beta1/.repo-metadata.json * must have required property 
'release_level' in google-cloud-notebooks/.repo-metadata.json * api_shortname field missing from google-cloud-notebooks/.repo-metadata.json * must have required property 'release_level' in google-cloud-orchestration-airflow-service-v1/.repo-metadata.json * api_shortname field missing from google-cloud-orchestration-airflow-service-v1/.repo-metadata.json * must have required property 'release_level' in google-cloud-orchestration-airflow-service/.repo-metadata.json * api_shortname field missing from google-cloud-orchestration-airflow-service/.repo-metadata.json * must have required property 'release_level' in google-cloud-org_policy-v2/.repo-metadata.json * api_shortname field missing from google-cloud-org_policy-v2/.repo-metadata.json * must have required property 'release_level' in google-cloud-org_policy/.repo-metadata.json * api_shortname field missing from google-cloud-org_policy/.repo-metadata.json * must have required property 'release_level' in google-cloud-os_config-v1/.repo-metadata.json * api_shortname field missing from google-cloud-os_config-v1/.repo-metadata.json * must have required property 'release_level' in google-cloud-os_config-v1alpha/.repo-metadata.json * api_shortname field missing from google-cloud-os_config-v1alpha/.repo-metadata.json * must have required property 'release_level' in google-cloud-os_config/.repo-metadata.json * api_shortname field missing from google-cloud-os_config/.repo-metadata.json * must have required property 'release_level' in google-cloud-os_login-v1/.repo-metadata.json * api_shortname field missing from google-cloud-os_login-v1/.repo-metadata.json * must have required property 'release_level' in google-cloud-os_login-v1beta/.repo-metadata.json * api_shortname field missing from google-cloud-os_login-v1beta/.repo-metadata.json * must have required property 'release_level' in google-cloud-os_login/.repo-metadata.json * api_shortname field missing from google-cloud-os_login/.repo-metadata.json * must have required 
property 'release_level' in google-cloud-phishing_protection-v1beta1/.repo-metadata.json * api_shortname field missing from google-cloud-phishing_protection-v1beta1/.repo-metadata.json * must have required property 'release_level' in google-cloud-phishing_protection/.repo-metadata.json * api_shortname field missing from google-cloud-phishing_protection/.repo-metadata.json * must have required property 'release_level' in google-cloud-policy_troubleshooter-v1/.repo-metadata.json * api_shortname field missing from google-cloud-policy_troubleshooter-v1/.repo-metadata.json * must have required property 'release_level' in google-cloud-policy_troubleshooter/.repo-metadata.json * api_shortname field missing from google-cloud-policy_troubleshooter/.repo-metadata.json * must have required property 'release_level' in google-cloud-private_catalog-v1beta1/.repo-metadata.json * api_shortname field missing from google-cloud-private_catalog-v1beta1/.repo-metadata.json * must have required property 'release_level' in google-cloud-private_catalog/.repo-metadata.json * api_shortname field missing from google-cloud-private_catalog/.repo-metadata.json * must have required property 'release_level' in google-cloud-profiler-v2/.repo-metadata.json * api_shortname field missing from google-cloud-profiler-v2/.repo-metadata.json * must have required property 'release_level' in google-cloud-profiler/.repo-metadata.json * api_shortname field missing from google-cloud-profiler/.repo-metadata.json * must have required property 'release_level' in google-cloud-pubsub-v1/.repo-metadata.json * api_shortname field missing from google-cloud-pubsub-v1/.repo-metadata.json * must have required property 'release_level' in google-cloud-pubsub/.repo-metadata.json * api_shortname field missing from google-cloud-pubsub/.repo-metadata.json * must have required property 'release_level' in google-cloud-recaptcha_enterprise-v1/.repo-metadata.json * api_shortname field missing from 
google-cloud-recaptcha_enterprise-v1/.repo-metadata.json * must have required property 'release_level' in google-cloud-recaptcha_enterprise-v1beta1/.repo-metadata.json * api_shortname field missing from google-cloud-recaptcha_enterprise-v1beta1/.repo-metadata.json * must have required property 'release_level' in google-cloud-recaptcha_enterprise/.repo-metadata.json * api_shortname field missing from google-cloud-recaptcha_enterprise/.repo-metadata.json * must have required property 'release_level' in google-cloud-recommendation_engine-v1beta1/.repo-metadata.json * api_shortname field missing from google-cloud-recommendation_engine-v1beta1/.repo-metadata.json * must have required property 'release_level' in google-cloud-recommendation_engine/.repo-metadata.json * api_shortname field missing from google-cloud-recommendation_engine/.repo-metadata.json * must have required property 'release_level' in google-cloud-recommender-v1/.repo-metadata.json * api_shortname field missing from google-cloud-recommender-v1/.repo-metadata.json * must have required property 'release_level' in google-cloud-recommender/.repo-metadata.json * api_shortname field missing from google-cloud-recommender/.repo-metadata.json * must have required property 'release_level' in google-cloud-redis-v1/.repo-metadata.json * api_shortname field missing from google-cloud-redis-v1/.repo-metadata.json * must have required property 'release_level' in google-cloud-redis-v1beta1/.repo-metadata.json * api_shortname field missing from google-cloud-redis-v1beta1/.repo-metadata.json * must have required property 'release_level' in google-cloud-redis/.repo-metadata.json * api_shortname field missing from google-cloud-redis/.repo-metadata.json * must have required property 'release_level' in google-cloud-resource_manager-v3/.repo-metadata.json * api_shortname field missing from google-cloud-resource_manager-v3/.repo-metadata.json * must have required property 'release_level' in 
google-cloud-resource_manager/.repo-metadata.json
* api_shortname field missing from google-cloud-resource_manager/.repo-metadata.json
* must have required property 'release_level' in google-cloud-resource_settings-v1/.repo-metadata.json
* api_shortname field missing from google-cloud-resource_settings-v1/.repo-metadata.json
* must have required property 'release_level' in google-cloud-resource_settings/.repo-metadata.json
* api_shortname field missing from google-cloud-resource_settings/.repo-metadata.json
* must have required property 'release_level' in google-cloud-retail-v2/.repo-metadata.json
* api_shortname field missing from google-cloud-retail-v2/.repo-metadata.json
* must have required property 'release_level' in google-cloud-retail/.repo-metadata.json
* api_shortname field missing from google-cloud-retail/.repo-metadata.json
* must have required property 'release_level' in google-cloud-scheduler-v1/.repo-metadata.json
* api_shortname field missing from google-cloud-scheduler-v1/.repo-metadata.json
* must have required property 'release_level' in google-cloud-scheduler-v1beta1/.repo-metadata.json
* api_shortname field missing from google-cloud-scheduler-v1beta1/.repo-metadata.json
* must have required property 'release_level' in google-cloud-scheduler/.repo-metadata.json
* api_shortname field missing from google-cloud-scheduler/.repo-metadata.json
* must have required property 'release_level' in google-cloud-secret_manager-v1/.repo-metadata.json
* api_shortname field missing from google-cloud-secret_manager-v1/.repo-metadata.json
* must have required property 'release_level' in google-cloud-secret_manager-v1beta1/.repo-metadata.json
* api_shortname field missing from google-cloud-secret_manager-v1beta1/.repo-metadata.json
* must have required property 'release_level' in google-cloud-secret_manager/.repo-metadata.json
* api_shortname field missing from google-cloud-secret_manager/.repo-metadata.json
* must have required property 'release_level' in google-cloud-security-private_ca-v1/.repo-metadata.json
* api_shortname field missing from google-cloud-security-private_ca-v1/.repo-metadata.json
* must have required property 'release_level' in google-cloud-security-private_ca-v1beta1/.repo-metadata.json
* api_shortname field missing from google-cloud-security-private_ca-v1beta1/.repo-metadata.json
* must have required property 'release_level' in google-cloud-security-private_ca/.repo-metadata.json
* api_shortname field missing from google-cloud-security-private_ca/.repo-metadata.json
* must have required property 'release_level' in google-cloud-security_center-v1/.repo-metadata.json
* api_shortname field missing from google-cloud-security_center-v1/.repo-metadata.json
* must have required property 'release_level' in google-cloud-security_center-v1p1beta1/.repo-metadata.json
* api_shortname field missing from google-cloud-security_center-v1p1beta1/.repo-metadata.json
* must have required property 'release_level' in google-cloud-security_center/.repo-metadata.json
* api_shortname field missing from google-cloud-security_center/.repo-metadata.json
* must have required property 'release_level' in google-cloud-service_control-v1/.repo-metadata.json
* api_shortname field missing from google-cloud-service_control-v1/.repo-metadata.json
* must have required property 'release_level' in google-cloud-service_control/.repo-metadata.json
* api_shortname field missing from google-cloud-service_control/.repo-metadata.json
* must have required property 'release_level' in google-cloud-service_directory-v1/.repo-metadata.json
* api_shortname field missing from google-cloud-service_directory-v1/.repo-metadata.json
* must have required property 'release_level' in google-cloud-service_directory-v1beta1/.repo-metadata.json
* api_shortname field missing from google-cloud-service_directory-v1beta1/.repo-metadata.json
* must have required property 'release_level' in google-cloud-service_directory/.repo-metadata.json
* api_shortname field missing from google-cloud-service_directory/.repo-metadata.json
* must have required property 'release_level' in google-cloud-service_management-v1/.repo-metadata.json
* api_shortname field missing from google-cloud-service_management-v1/.repo-metadata.json
* must have required property 'release_level' in google-cloud-service_management/.repo-metadata.json
* api_shortname field missing from google-cloud-service_management/.repo-metadata.json
* must have required property 'release_level' in google-cloud-service_usage-v1/.repo-metadata.json
* api_shortname field missing from google-cloud-service_usage-v1/.repo-metadata.json
* must have required property 'release_level' in google-cloud-service_usage/.repo-metadata.json
* api_shortname field missing from google-cloud-service_usage/.repo-metadata.json
* must have required property 'release_level' in google-cloud-shell-v1/.repo-metadata.json
* api_shortname field missing from google-cloud-shell-v1/.repo-metadata.json
* must have required property 'release_level' in google-cloud-shell/.repo-metadata.json
* api_shortname field missing from google-cloud-shell/.repo-metadata.json
* must have required property 'release_level' in google-cloud-spanner-admin-database-v1/.repo-metadata.json
* api_shortname field missing from google-cloud-spanner-admin-database-v1/.repo-metadata.json
* must have required property 'release_level' in google-cloud-spanner-admin-instance-v1/.repo-metadata.json
* api_shortname field missing from google-cloud-spanner-admin-instance-v1/.repo-metadata.json
* must have required property 'release_level' in google-cloud-spanner-v1/.repo-metadata.json
* api_shortname field missing from google-cloud-spanner-v1/.repo-metadata.json
* must have required property 'release_level' in google-cloud-spanner/.repo-metadata.json
* api_shortname field missing from google-cloud-spanner/.repo-metadata.json
* must have required property 'release_level' in google-cloud-speech-v1/.repo-metadata.json
* api_shortname field missing from google-cloud-speech-v1/.repo-metadata.json
* must have required property 'release_level' in google-cloud-speech-v1p1beta1/.repo-metadata.json
* api_shortname field missing from google-cloud-speech-v1p1beta1/.repo-metadata.json
* must have required property 'release_level' in google-cloud-speech/.repo-metadata.json
* api_shortname field missing from google-cloud-speech/.repo-metadata.json
* must have required property 'release_level' in google-cloud-storage/.repo-metadata.json
* api_shortname field missing from google-cloud-storage/.repo-metadata.json
* must have required property 'release_level' in google-cloud-storage_transfer-v1/.repo-metadata.json
* api_shortname field missing from google-cloud-storage_transfer-v1/.repo-metadata.json
* must have required property 'release_level' in google-cloud-storage_transfer/.repo-metadata.json
* api_shortname field missing from google-cloud-storage_transfer/.repo-metadata.json
* must have required property 'release_level' in google-cloud-talent-v4/.repo-metadata.json
* api_shortname field missing from google-cloud-talent-v4/.repo-metadata.json
* must have required property 'release_level' in google-cloud-talent-v4beta1/.repo-metadata.json
* api_shortname field missing from google-cloud-talent-v4beta1/.repo-metadata.json
* must have required property 'release_level' in google-cloud-talent/.repo-metadata.json
* api_shortname field missing from google-cloud-talent/.repo-metadata.json
* must have required property 'release_level' in google-cloud-tasks-v2/.repo-metadata.json
* api_shortname field missing from google-cloud-tasks-v2/.repo-metadata.json
* must have required property 'release_level' in google-cloud-tasks-v2beta2/.repo-metadata.json
* api_shortname field missing from google-cloud-tasks-v2beta2/.repo-metadata.json
* must have required property 'release_level' in google-cloud-tasks-v2beta3/.repo-metadata.json
* api_shortname field missing from google-cloud-tasks-v2beta3/.repo-metadata.json
* must have required property 'release_level' in google-cloud-tasks/.repo-metadata.json
* api_shortname field missing from google-cloud-tasks/.repo-metadata.json
* must have required property 'release_level' in google-cloud-text_to_speech-v1/.repo-metadata.json
* api_shortname field missing from google-cloud-text_to_speech-v1/.repo-metadata.json
* must have required property 'release_level' in google-cloud-text_to_speech-v1beta1/.repo-metadata.json
* api_shortname field missing from google-cloud-text_to_speech-v1beta1/.repo-metadata.json
* must have required property 'release_level' in google-cloud-text_to_speech/.repo-metadata.json
* api_shortname field missing from google-cloud-text_to_speech/.repo-metadata.json
* must have required property 'release_level' in google-cloud-tpu-v1/.repo-metadata.json
* api_shortname field missing from google-cloud-tpu-v1/.repo-metadata.json
* must have required property 'release_level' in google-cloud-tpu/.repo-metadata.json
* api_shortname field missing from google-cloud-tpu/.repo-metadata.json
* must have required property 'release_level' in google-cloud-trace-v1/.repo-metadata.json
* api_shortname field missing from google-cloud-trace-v1/.repo-metadata.json
* must have required property 'release_level' in google-cloud-trace-v2/.repo-metadata.json
* api_shortname field missing from google-cloud-trace-v2/.repo-metadata.json
* must have required property 'release_level' in google-cloud-trace/.repo-metadata.json
* must have required property 'release_level' in google-cloud-translate-v2/.repo-metadata.json
* api_shortname field missing from google-cloud-translate-v2/.repo-metadata.json
* must have required property 'release_level' in google-cloud-translate-v3/.repo-metadata.json
* api_shortname field missing from google-cloud-translate-v3/.repo-metadata.json
* must have required property 'release_level' in google-cloud-translate/.repo-metadata.json
* api_shortname field missing from google-cloud-translate/.repo-metadata.json
* must have required property 'release_level' in google-cloud-video-transcoder-v1/.repo-metadata.json
* api_shortname field missing from google-cloud-video-transcoder-v1/.repo-metadata.json
* must have required property 'release_level' in google-cloud-video-transcoder-v1beta1/.repo-metadata.json
* api_shortname field missing from google-cloud-video-transcoder-v1beta1/.repo-metadata.json
* must have required property 'release_level' in google-cloud-video-transcoder/.repo-metadata.json
* api_shortname field missing from google-cloud-video-transcoder/.repo-metadata.json
* must have required property 'release_level' in google-cloud-video_intelligence-v1/.repo-metadata.json
* api_shortname field missing from google-cloud-video_intelligence-v1/.repo-metadata.json
* must have required property 'release_level' in google-cloud-video_intelligence-v1beta2/.repo-metadata.json
* api_shortname field missing from google-cloud-video_intelligence-v1beta2/.repo-metadata.json
* must have required property 'release_level' in google-cloud-video_intelligence-v1p1beta1/.repo-metadata.json
* api_shortname field missing from google-cloud-video_intelligence-v1p1beta1/.repo-metadata.json
* must have required property 'release_level' in google-cloud-video_intelligence-v1p2beta1/.repo-metadata.json
* api_shortname field missing from google-cloud-video_intelligence-v1p2beta1/.repo-metadata.json
* must have required property 'release_level' in google-cloud-video_intelligence-v1p3beta1/.repo-metadata.json
* api_shortname field missing from google-cloud-video_intelligence-v1p3beta1/.repo-metadata.json
* must have required property 'release_level' in google-cloud-video_intelligence/.repo-metadata.json
* api_shortname field missing from google-cloud-video_intelligence/.repo-metadata.json
* must have required property 'release_level' in google-cloud-vision-v1/.repo-metadata.json
* api_shortname field missing from google-cloud-vision-v1/.repo-metadata.json
* must have required property 'release_level' in google-cloud-vision-v1p3beta1/.repo-metadata.json
* api_shortname field missing from google-cloud-vision-v1p3beta1/.repo-metadata.json
* must have required property 'release_level' in google-cloud-vision-v1p4beta1/.repo-metadata.json
* api_shortname field missing from google-cloud-vision-v1p4beta1/.repo-metadata.json
* must have required property 'release_level' in google-cloud-vision/.repo-metadata.json
* api_shortname field missing from google-cloud-vision/.repo-metadata.json
* must have required property 'release_level' in google-cloud-vm_migration-v1/.repo-metadata.json
* api_shortname field missing from google-cloud-vm_migration-v1/.repo-metadata.json
* must have required property 'release_level' in google-cloud-vm_migration/.repo-metadata.json
* api_shortname field missing from google-cloud-vm_migration/.repo-metadata.json
* must have required property 'release_level' in google-cloud-vpc_access-v1/.repo-metadata.json
* api_shortname field missing from google-cloud-vpc_access-v1/.repo-metadata.json
* must have required property 'release_level' in google-cloud-vpc_access/.repo-metadata.json
* api_shortname field missing from google-cloud-vpc_access/.repo-metadata.json
* must have required property 'release_level' in google-cloud-web_risk-v1/.repo-metadata.json
* api_shortname field missing from google-cloud-web_risk-v1/.repo-metadata.json
* must have required property 'release_level' in google-cloud-web_risk-v1beta1/.repo-metadata.json
* api_shortname field missing from google-cloud-web_risk-v1beta1/.repo-metadata.json
* must have required property 'release_level' in google-cloud-web_risk/.repo-metadata.json
* api_shortname field missing from google-cloud-web_risk/.repo-metadata.json
* must have required property 'release_level' in google-cloud-web_security_scanner-v1/.repo-metadata.json
* api_shortname field missing from google-cloud-web_security_scanner-v1/.repo-metadata.json
* must have required property 'release_level' in google-cloud-web_security_scanner-v1beta/.repo-metadata.json
* api_shortname field missing from google-cloud-web_security_scanner-v1beta/.repo-metadata.json
* must have required property 'release_level' in google-cloud-web_security_scanner/.repo-metadata.json
* api_shortname field missing from google-cloud-web_security_scanner/.repo-metadata.json
* must have required property 'release_level' in google-cloud-webrisk/.repo-metadata.json
* api_shortname field missing from google-cloud-webrisk/.repo-metadata.json
* must have required property 'release_level' in google-cloud-workflows-executions-v1/.repo-metadata.json
* api_shortname field missing from google-cloud-workflows-executions-v1/.repo-metadata.json
* must have required property 'release_level' in google-cloud-workflows-executions-v1beta/.repo-metadata.json
* api_shortname field missing from google-cloud-workflows-executions-v1beta/.repo-metadata.json
* must have required property 'release_level' in google-cloud-workflows-v1/.repo-metadata.json
* api_shortname field missing from google-cloud-workflows-v1/.repo-metadata.json
* must have required property 'release_level' in google-cloud-workflows-v1beta/.repo-metadata.json
* api_shortname field missing from google-cloud-workflows-v1beta/.repo-metadata.json
* must have required property 'release_level' in google-cloud-workflows/.repo-metadata.json
* api_shortname field missing from google-cloud-workflows/.repo-metadata.json
* must have required property 'library_type' in google-cloud/.repo-metadata.json
* must have required property 'release_level' in google-cloud/.repo-metadata.json
* must have required property 'release_level' in google-iam-credentials-v1/.repo-metadata.json
* api_shortname field missing from google-iam-credentials-v1/.repo-metadata.json
* must have required property 'release_level' in google-iam-credentials/.repo-metadata.json
* api_shortname field missing from google-iam-credentials/.repo-metadata.json
* must have required property 'release_level' in google-iam-v1beta/.repo-metadata.json
* api_shortname field missing from google-iam-v1beta/.repo-metadata.json
* must have required property 'release_level' in google-identity-access_context_manager-v1/.repo-metadata.json
* api_shortname field missing from google-identity-access_context_manager-v1/.repo-metadata.json
* must have required property 'release_level' in google-identity-access_context_manager/.repo-metadata.json
* api_shortname field missing from google-identity-access_context_manager/.repo-metadata.json
* must have required property 'library_type' in grafeas-client/.repo-metadata.json
* must have required property 'release_level' in grafeas-client/.repo-metadata.json
* must have required property 'release_level' in grafeas-v1/.repo-metadata.json
* api_shortname field missing from grafeas-v1/.repo-metadata.json
* must have required property 'release_level' in grafeas/.repo-metadata.json
* api_shortname field missing from grafeas/.repo-metadata.json
* must have required property 'library_type' in stackdriver-core/.repo-metadata.json
* must have required property 'release_level' in stackdriver-core/.repo-metadata.json
* must have required property 'library_type' in stackdriver/.repo-metadata.json
* must have required property 'release_level' in stackdriver/.repo-metadata.json

☝️ Once you correct these problems, you can close this issue. Reach out to **go/github-automation** if you have any questions.
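Every error above is one of the same three complaints: a package's `.repo-metadata.json` is missing `release_level`, `api_shortname`, or `library_type`. The actual check is run by the go/github-automation tooling, but as an illustrative sketch (the function names, `REQUIRED_FIELDS` tuple, and directory layout below are assumptions, not the real validator), a scan that reproduces this kind of report might look like:

```python
import json
from pathlib import Path

# Fields the errors above report as required; assumed from the issue text,
# not taken from the real schema.
REQUIRED_FIELDS = ("release_level", "api_shortname", "library_type")

def missing_fields(metadata: dict) -> list:
    """Return the required fields absent from one .repo-metadata.json payload."""
    return [field for field in REQUIRED_FIELDS if field not in metadata]

def scan(repo_root: Path) -> dict:
    """Map each package's .repo-metadata.json path to its missing fields.

    Assumes the monorepo layout implied by the error list: one package
    directory per gem, each holding a .repo-metadata.json at its top level.
    """
    problems = {}
    for path in sorted(repo_root.glob("*/.repo-metadata.json")):
        data = json.loads(path.read_text())
        missing = missing_fields(data)
        if missing:
            problems[str(path)] = missing
    return problems
```

Fixing an entry then means adding the named keys to that package's `.repo-metadata.json` until `missing_fields` returns an empty list for it.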
property release level in google cloud monitoring repo metadata json api shortname field missing from google cloud monitoring repo metadata json must have required property release level in google cloud monitoring repo metadata json api shortname field missing from google cloud monitoring repo metadata json must have required property release level in google cloud network connectivity repo metadata json api shortname field missing from google cloud network connectivity repo metadata json must have required property release level in google cloud network connectivity repo metadata json api shortname field missing from google cloud network connectivity repo metadata json must have required property release level in google cloud network connectivity repo metadata json api shortname field missing from google cloud network connectivity repo metadata json must have required property release level in google cloud network management repo metadata json api shortname field missing from google cloud network management repo metadata json must have required property release level in google cloud network management repo metadata json api shortname field missing from google cloud network management repo metadata json must have required property release level in google cloud network security repo metadata json api shortname field missing from google cloud network security repo metadata json must have required property release level in google cloud network security repo metadata json api shortname field missing from google cloud network security repo metadata json must have required property release level in google cloud notebooks repo metadata json api shortname field missing from google cloud notebooks repo metadata json must have required property release level in google cloud notebooks repo metadata json api shortname field missing from google cloud notebooks repo metadata json must have required property release level in google cloud orchestration airflow service repo metadata 
json api shortname field missing from google cloud orchestration airflow service repo metadata json must have required property release level in google cloud orchestration airflow service repo metadata json api shortname field missing from google cloud orchestration airflow service repo metadata json must have required property release level in google cloud org policy repo metadata json api shortname field missing from google cloud org policy repo metadata json must have required property release level in google cloud org policy repo metadata json api shortname field missing from google cloud org policy repo metadata json must have required property release level in google cloud os config repo metadata json api shortname field missing from google cloud os config repo metadata json must have required property release level in google cloud os config repo metadata json api shortname field missing from google cloud os config repo metadata json must have required property release level in google cloud os config repo metadata json api shortname field missing from google cloud os config repo metadata json must have required property release level in google cloud os login repo metadata json api shortname field missing from google cloud os login repo metadata json must have required property release level in google cloud os login repo metadata json api shortname field missing from google cloud os login repo metadata json must have required property release level in google cloud os login repo metadata json api shortname field missing from google cloud os login repo metadata json must have required property release level in google cloud phishing protection repo metadata json api shortname field missing from google cloud phishing protection repo metadata json must have required property release level in google cloud phishing protection repo metadata json api shortname field missing from google cloud phishing protection repo metadata json must have required property release 
level in google cloud policy troubleshooter repo metadata json api shortname field missing from google cloud policy troubleshooter repo metadata json must have required property release level in google cloud policy troubleshooter repo metadata json api shortname field missing from google cloud policy troubleshooter repo metadata json must have required property release level in google cloud private catalog repo metadata json api shortname field missing from google cloud private catalog repo metadata json must have required property release level in google cloud private catalog repo metadata json api shortname field missing from google cloud private catalog repo metadata json must have required property release level in google cloud profiler repo metadata json api shortname field missing from google cloud profiler repo metadata json must have required property release level in google cloud profiler repo metadata json api shortname field missing from google cloud profiler repo metadata json must have required property release level in google cloud pubsub repo metadata json api shortname field missing from google cloud pubsub repo metadata json must have required property release level in google cloud pubsub repo metadata json api shortname field missing from google cloud pubsub repo metadata json must have required property release level in google cloud recaptcha enterprise repo metadata json api shortname field missing from google cloud recaptcha enterprise repo metadata json must have required property release level in google cloud recaptcha enterprise repo metadata json api shortname field missing from google cloud recaptcha enterprise repo metadata json must have required property release level in google cloud recaptcha enterprise repo metadata json api shortname field missing from google cloud recaptcha enterprise repo metadata json must have required property release level in google cloud recommendation engine repo metadata json api shortname field missing from 
google cloud recommendation engine repo metadata json must have required property release level in google cloud recommendation engine repo metadata json api shortname field missing from google cloud recommendation engine repo metadata json must have required property release level in google cloud recommender repo metadata json api shortname field missing from google cloud recommender repo metadata json must have required property release level in google cloud recommender repo metadata json api shortname field missing from google cloud recommender repo metadata json must have required property release level in google cloud redis repo metadata json api shortname field missing from google cloud redis repo metadata json must have required property release level in google cloud redis repo metadata json api shortname field missing from google cloud redis repo metadata json must have required property release level in google cloud redis repo metadata json api shortname field missing from google cloud redis repo metadata json must have required property release level in google cloud resource manager repo metadata json api shortname field missing from google cloud resource manager repo metadata json must have required property release level in google cloud resource manager repo metadata json api shortname field missing from google cloud resource manager repo metadata json must have required property release level in google cloud resource settings repo metadata json api shortname field missing from google cloud resource settings repo metadata json must have required property release level in google cloud resource settings repo metadata json api shortname field missing from google cloud resource settings repo metadata json must have required property release level in google cloud retail repo metadata json api shortname field missing from google cloud retail repo metadata json must have required property release level in google cloud retail repo metadata json api shortname 
field missing from google cloud retail repo metadata json must have required property release level in google cloud scheduler repo metadata json api shortname field missing from google cloud scheduler repo metadata json must have required property release level in google cloud scheduler repo metadata json api shortname field missing from google cloud scheduler repo metadata json must have required property release level in google cloud scheduler repo metadata json api shortname field missing from google cloud scheduler repo metadata json must have required property release level in google cloud secret manager repo metadata json api shortname field missing from google cloud secret manager repo metadata json must have required property release level in google cloud secret manager repo metadata json api shortname field missing from google cloud secret manager repo metadata json must have required property release level in google cloud secret manager repo metadata json api shortname field missing from google cloud secret manager repo metadata json must have required property release level in google cloud security private ca repo metadata json api shortname field missing from google cloud security private ca repo metadata json must have required property release level in google cloud security private ca repo metadata json api shortname field missing from google cloud security private ca repo metadata json must have required property release level in google cloud security private ca repo metadata json api shortname field missing from google cloud security private ca repo metadata json must have required property release level in google cloud security center repo metadata json api shortname field missing from google cloud security center repo metadata json must have required property release level in google cloud security center repo metadata json api shortname field missing from google cloud security center repo metadata json must have required property release level in 
google cloud security center repo metadata json api shortname field missing from google cloud security center repo metadata json must have required property release level in google cloud service control repo metadata json api shortname field missing from google cloud service control repo metadata json must have required property release level in google cloud service control repo metadata json api shortname field missing from google cloud service control repo metadata json must have required property release level in google cloud service directory repo metadata json api shortname field missing from google cloud service directory repo metadata json must have required property release level in google cloud service directory repo metadata json api shortname field missing from google cloud service directory repo metadata json must have required property release level in google cloud service directory repo metadata json api shortname field missing from google cloud service directory repo metadata json must have required property release level in google cloud service management repo metadata json api shortname field missing from google cloud service management repo metadata json must have required property release level in google cloud service management repo metadata json api shortname field missing from google cloud service management repo metadata json must have required property release level in google cloud service usage repo metadata json api shortname field missing from google cloud service usage repo metadata json must have required property release level in google cloud service usage repo metadata json api shortname field missing from google cloud service usage repo metadata json must have required property release level in google cloud shell repo metadata json api shortname field missing from google cloud shell repo metadata json must have required property release level in google cloud shell repo metadata json api shortname field missing from google cloud shell 
repo metadata json must have required property release level in google cloud spanner admin database repo metadata json api shortname field missing from google cloud spanner admin database repo metadata json must have required property release level in google cloud spanner admin instance repo metadata json api shortname field missing from google cloud spanner admin instance repo metadata json must have required property release level in google cloud spanner repo metadata json api shortname field missing from google cloud spanner repo metadata json must have required property release level in google cloud spanner repo metadata json api shortname field missing from google cloud spanner repo metadata json must have required property release level in google cloud speech repo metadata json api shortname field missing from google cloud speech repo metadata json must have required property release level in google cloud speech repo metadata json api shortname field missing from google cloud speech repo metadata json must have required property release level in google cloud speech repo metadata json api shortname field missing from google cloud speech repo metadata json must have required property release level in google cloud storage repo metadata json api shortname field missing from google cloud storage repo metadata json must have required property release level in google cloud storage transfer repo metadata json api shortname field missing from google cloud storage transfer repo metadata json must have required property release level in google cloud storage transfer repo metadata json api shortname field missing from google cloud storage transfer repo metadata json must have required property release level in google cloud talent repo metadata json api shortname field missing from google cloud talent repo metadata json must have required property release level in google cloud talent repo metadata json api shortname field missing from google cloud talent repo metadata 
json must have required property release level in google cloud talent repo metadata json api shortname field missing from google cloud talent repo metadata json must have required property release level in google cloud tasks repo metadata json api shortname field missing from google cloud tasks repo metadata json must have required property release level in google cloud tasks repo metadata json api shortname field missing from google cloud tasks repo metadata json must have required property release level in google cloud tasks repo metadata json api shortname field missing from google cloud tasks repo metadata json must have required property release level in google cloud tasks repo metadata json api shortname field missing from google cloud tasks repo metadata json must have required property release level in google cloud text to speech repo metadata json api shortname field missing from google cloud text to speech repo metadata json must have required property release level in google cloud text to speech repo metadata json api shortname field missing from google cloud text to speech repo metadata json must have required property release level in google cloud text to speech repo metadata json api shortname field missing from google cloud text to speech repo metadata json must have required property release level in google cloud tpu repo metadata json api shortname field missing from google cloud tpu repo metadata json must have required property release level in google cloud tpu repo metadata json api shortname field missing from google cloud tpu repo metadata json must have required property release level in google cloud trace repo metadata json api shortname field missing from google cloud trace repo metadata json must have required property release level in google cloud trace repo metadata json api shortname field missing from google cloud trace repo metadata json must have required property release level in google cloud trace repo metadata json must have 
required property release level in google cloud translate repo metadata json api shortname field missing from google cloud translate repo metadata json must have required property release level in google cloud translate repo metadata json api shortname field missing from google cloud translate repo metadata json must have required property release level in google cloud translate repo metadata json api shortname field missing from google cloud translate repo metadata json must have required property release level in google cloud video transcoder repo metadata json api shortname field missing from google cloud video transcoder repo metadata json must have required property release level in google cloud video transcoder repo metadata json api shortname field missing from google cloud video transcoder repo metadata json must have required property release level in google cloud video transcoder repo metadata json api shortname field missing from google cloud video transcoder repo metadata json must have required property release level in google cloud video intelligence repo metadata json api shortname field missing from google cloud video intelligence repo metadata json must have required property release level in google cloud video intelligence repo metadata json api shortname field missing from google cloud video intelligence repo metadata json must have required property release level in google cloud video intelligence repo metadata json api shortname field missing from google cloud video intelligence repo metadata json must have required property release level in google cloud video intelligence repo metadata json api shortname field missing from google cloud video intelligence repo metadata json must have required property release level in google cloud video intelligence repo metadata json api shortname field missing from google cloud video intelligence repo metadata json must have required property release level in google cloud video intelligence repo metadata json 
api shortname field missing from google cloud video intelligence repo metadata json must have required property release level in google cloud vision repo metadata json api shortname field missing from google cloud vision repo metadata json must have required property release level in google cloud vision repo metadata json api shortname field missing from google cloud vision repo metadata json must have required property release level in google cloud vision repo metadata json api shortname field missing from google cloud vision repo metadata json must have required property release level in google cloud vision repo metadata json api shortname field missing from google cloud vision repo metadata json must have required property release level in google cloud vm migration repo metadata json api shortname field missing from google cloud vm migration repo metadata json must have required property release level in google cloud vm migration repo metadata json api shortname field missing from google cloud vm migration repo metadata json must have required property release level in google cloud vpc access repo metadata json api shortname field missing from google cloud vpc access repo metadata json must have required property release level in google cloud vpc access repo metadata json api shortname field missing from google cloud vpc access repo metadata json must have required property release level in google cloud web risk repo metadata json api shortname field missing from google cloud web risk repo metadata json must have required property release level in google cloud web risk repo metadata json api shortname field missing from google cloud web risk repo metadata json must have required property release level in google cloud web risk repo metadata json api shortname field missing from google cloud web risk repo metadata json must have required property release level in google cloud web security scanner repo metadata json api shortname field missing from google cloud web 
security scanner repo metadata json must have required property release level in google cloud web security scanner repo metadata json api shortname field missing from google cloud web security scanner repo metadata json must have required property release level in google cloud web security scanner repo metadata json api shortname field missing from google cloud web security scanner repo metadata json must have required property release level in google cloud webrisk repo metadata json api shortname field missing from google cloud webrisk repo metadata json must have required property release level in google cloud workflows executions repo metadata json api shortname field missing from google cloud workflows executions repo metadata json must have required property release level in google cloud workflows executions repo metadata json api shortname field missing from google cloud workflows executions repo metadata json must have required property release level in google cloud workflows repo metadata json api shortname field missing from google cloud workflows repo metadata json must have required property release level in google cloud workflows repo metadata json api shortname field missing from google cloud workflows repo metadata json must have required property release level in google cloud workflows repo metadata json api shortname field missing from google cloud workflows repo metadata json must have required property library type in google cloud repo metadata json must have required property release level in google cloud repo metadata json must have required property release level in google iam credentials repo metadata json api shortname field missing from google iam credentials repo metadata json must have required property release level in google iam credentials repo metadata json api shortname field missing from google iam credentials repo metadata json must have required property release level in google iam repo metadata json api shortname field missing 
from google iam repo metadata json must have required property release level in google identity access context manager repo metadata json api shortname field missing from google identity access context manager repo metadata json must have required property release level in google identity access context manager repo metadata json api shortname field missing from google identity access context manager repo metadata json must have required property library type in grafeas client repo metadata json must have required property release level in grafeas client repo metadata json must have required property release level in grafeas repo metadata json api shortname field missing from grafeas repo metadata json must have required property release level in grafeas repo metadata json api shortname field missing from grafeas repo metadata json must have required property library type in stackdriver core repo metadata json must have required property release level in stackdriver core repo metadata json must have required property library type in stackdriver repo metadata json must have required property release level in stackdriver repo metadata json ☝️ once you correct these problems you can close this issue reach out to go github automation if you have any questions
1
7,960
11,137,569,523
IssuesEvent
2019-12-20 19:43:28
openopps/openopps-platform
https://api.github.com/repos/openopps/openopps-platform
closed
View applicant - sort by
Apply Process Requirements Ready
Who: Community and sitewide admins What: ability to sort on applicant lists Why: in order to Acceptance criteria: - Add a sort by to the applicant listing - Default sort will be by last name - Sort by options will be for all fields on the table: Name, email, Status, and Last update Related tickets: 4049 - create applicant list 4130 - Filter on applicant list 4133 - Sort by on applicant list 4134 - .csv of applicants
1.0
View applicant - sort by - Who: Community and sitewide admins What: ability to sort on applicant lists Why: in order to Acceptance criteria: - Add a sort by to the applicant listing - Default sort will be by last name - Sort by options will be for all fields on the table: Name, email, Status, and Last update Related tickets: 4049 - create applicant list 4130 - Filter on applicant list 4133 - Sort by on applicant list 4134 - .csv of applicants
process
view applicant sort by who community and sitewide admins what ability to sort on applicant lists why in order to acceptance criteria add a sort by to the applicant listing default sort will be by last name sort by options will be for all fields on the table name email status and last update related tickets create applicant list filter on applicant list sort by on applicant list csv of applicants
1
16,790
22,036,471,942
IssuesEvent
2022-05-28 17:08:40
bow-simulation/virtualbow
https://api.github.com/repos/bow-simulation/virtualbow
closed
Clang vs GCC on MacOS
area: fem/numerics platform: macos area: software process type: idea
Compare simulation performance and decide whether to keep the release builds on Clang or switch to GCC.
1.0
Clang vs GCC on MacOS - Compare simulation performance and decide whether to keep the release builds on Clang or switch to GCC.
process
clang vs gcc on macos compare simulation performance and decide whether to keep the release builds on clang or switch to gcc
1
22,585
31,811,063,439
IssuesEvent
2023-09-13 16:52:11
geneontology/go-ontology
https://api.github.com/repos/geneontology/go-ontology
closed
NTR : [RNA alternative polyadenylation]
New term request RNA processes
_Please provide as much information as you can:_ * **Suggested term label:** RNA alternative polyadenylation parent of GO:0110104 mRNA alternative polyadenylation
1.0
NTR : [RNA alternative polyadenylation] - _Please provide as much information as you can:_ * **Suggested term label:** RNA alternative polyadenylation parent of GO:0110104 mRNA alternative polyadenylation
process
ntr please provide as much information as you can suggested term label rna alternative polyadenylation parent of go mrna alternative polyadenylation
1
11,590
14,447,052,223
IssuesEvent
2020-12-08 02:48:33
googleapis/google-api-python-client
https://api.github.com/repos/googleapis/google-api-python-client
closed
Change dependency from pycrypto
type: process
**Is your feature request related to a problem? Please describe.** Pycrypto has not been maintained since 2014. So it susceptible to vulnerabilities like CVE-2013-7459 and CVE-2018-6594, and lacks some compatibility with the latest versions of python. Thus should not be in the tox file, documentation, etc. **Describe the solution you'd like** Replace pycrypto with an active, well-maintained python crypto library for development and testing **Describe alternatives you've considered** `pycryptodome` is "an almost drop-in replacement for the old PyCrypto library", but other libraries can still be considered.
1.0
Change dependency from pycrypto - **Is your feature request related to a problem? Please describe.** Pycrypto has not been maintained since 2014. So it susceptible to vulnerabilities like CVE-2013-7459 and CVE-2018-6594, and lacks some compatibility with the latest versions of python. Thus should not be in the tox file, documentation, etc. **Describe the solution you'd like** Replace pycrypto with an active, well-maintained python crypto library for development and testing **Describe alternatives you've considered** `pycryptodome` is "an almost drop-in replacement for the old PyCrypto library", but other libraries can still be considered.
process
change dependency from pycrypto is your feature request related to a problem please describe pycrypto has not been maintained since so it susceptible to vulnerabilities like cve and cve and lacks some compatibility with the latest versions of python thus should not be in the tox file documentation etc describe the solution you d like replace pycrypto with an active well maintained python crypto library for development and testing describe alternatives you ve considered pycryptodome is an almost drop in replacement for the old pycrypto library but other libraries can still be considered
1
14,920
3,295,034,281
IssuesEvent
2015-10-31 15:51:41
devtees/shirt-ideas
https://api.github.com/repos/devtees/shirt-ideas
closed
Designerd
confirmed for next run design stage
My homie @mdo uses the term "designerd" a lot, and it's a lovely label. Working on a typography based tee concept for those who wish to proudly display their designerdery. ![image](https://cloud.githubusercontent.com/assets/1319791/10109212/b0b16c1c-6393-11e5-8807-28fe18d7fbef.png) ![image](https://cloud.githubusercontent.com/assets/1319791/10109218/b83fc62c-6393-11e5-82b0-34d93f4e9d76.png)
1.0
Designerd - My homie @mdo uses the term "designerd" a lot, and it's a lovely label. Working on a typography based tee concept for those who wish to proudly display their designerdery. ![image](https://cloud.githubusercontent.com/assets/1319791/10109212/b0b16c1c-6393-11e5-8807-28fe18d7fbef.png) ![image](https://cloud.githubusercontent.com/assets/1319791/10109218/b83fc62c-6393-11e5-82b0-34d93f4e9d76.png)
non_process
designerd my homie mdo uses the term designerd a lot and it s a lovely label working on a typography based tee concept for those who wish to proudly display their designerdery
0
13,221
15,690,499,872
IssuesEvent
2021-03-25 16:47:40
hasura/ask-me-anything
https://api.github.com/repos/hasura/ask-me-anything
opened
When using `hasura console` and making metadata changes, how will it determine which file to modify?
processing-for-shortvid question
Per @scriptonist "when using `hasura console` and making metadata changes, how will it determine which file to modify?"
1.0
When using `hasura console` and making metadata changes, how will it determine which file to modify? - Per @scriptonist "when using `hasura console` and making metadata changes, how will it determine which file to modify?"
process
when using hasura console and making metadata changes how will it determine which file to modify per scriptonist when using hasura console and making metadata changes how will it determine which file to modify
1
7,751
10,866,131,845
IssuesEvent
2019-11-14 20:31:59
microsoft/ptvsd
https://api.github.com/repos/microsoft/ptvsd
opened
Sporadic multiproc test failures due to Popen bug
Bug area:Multiprocessing
This is a Python bug: https://bugs.python.org/issue37380 TL;DR: when a `subprocess.Popen` instance is GC'd, the process handle gets closed, but it is not removed from "active processes" list if the process in question was still running at the time. At some later point, attempting to create a _new_ `Popen` instance tries to clean up the active process list first thing in its `__init__` (invoking `subprocess._cleanup`, which invokes `subprocess._internal_poll` on everything in `subprocess._active`), and fails with "OSError: [WinError 6] The handle is invalid". This is not recoverable - from that point on, all attempts to spawn new subprocesses in that process will fail. This is fixed in Python 3.9, and backported to 3.8 and 3.7. However, 2.7, 3.5, and 3.6 are all still susceptible to this. In our test runs, this affects tests where the debuggee uses `subprocess` - `test_subprocess`, obviously, but also all the Flask tests. The likely culprit is our use of `Popen` in `ptvsd.enable_attach()`, where we spawn the adapter. The adapter remains alive for as long as debuggee is, by design, but the `Popen` instance is referenced by a local, and goes away immediately. If so, we can keep it alive for longer by referencing it from a global.
1.0
Sporadic multiproc test failures due to Popen bug - This is a Python bug: https://bugs.python.org/issue37380 TL;DR: when a `subprocess.Popen` instance is GC'd, the process handle gets closed, but it is not removed from "active processes" list if the process in question was still running at the time. At some later point, attempting to create a _new_ `Popen` instance tries to clean up the active process list first thing in its `__init__` (invoking `subprocess._cleanup`, which invokes `subprocess._internal_poll` on everything in `subprocess._active`), and fails with "OSError: [WinError 6] The handle is invalid". This is not recoverable - from that point on, all attempts to spawn new subprocesses in that process will fail. This is fixed in Python 3.9, and backported to 3.8 and 3.7. However, 2.7, 3.5, and 3.6 are all still susceptible to this. In our test runs, this affects tests where the debuggee uses `subprocess` - `test_subprocess`, obviously, but also all the Flask tests. The likely culprit is our use of `Popen` in `ptvsd.enable_attach()`, where we spawn the adapter. The adapter remains alive for as long as debuggee is, by design, but the `Popen` instance is referenced by a local, and goes away immediately. If so, we can keep it alive for longer by referencing it from a global.
process
sporadic multiproc test failures due to popen bug this is a python bug tl dr when a subprocess popen instance is gc d the process handle gets closed but it is not removed from active processes list if the process in question was still running at the time at some later point attempting to create a new popen instance tries to clean up the active process list first thing in its init invoking subprocess cleanup which invokes subprocess internal poll on everything in subprocess active and fails with oserror the handle is invalid this is not recoverable from that point on all attempts to spawn new subprocesses in that process will fail this is fixed in python and backported to and however and are all still susceptible to this in our test runs this affects tests where the debuggee uses subprocess test subprocess obviously but also all the flask tests the likely culprit is our use of popen in ptvsd enable attach where we spawn the adapter the adapter remains alive for as long as debuggee is by design but the popen instance is referenced by a local and goes away immediately if so we can keep it alive for longer by referencing it from a global
1
4,838
7,734,323,331
IssuesEvent
2018-05-26 23:05:19
monarch-games/nanoshooter
https://api.github.com/repos/monarch-games/nanoshooter
closed
Rambo needs to make his first commit
Process Workable Writing
It just needs to happen ## Criteria - [ ] Rambo checks out a new local branch off `master`. - [ ] Rambo scours the README.md and finds a typo to fix or improvement to make, and commits the change(s) to his local branch. - [ ] Rambo pushes his local branch to this GitHub project. - [ ] Rambo creates a pull request. - [ ] Rambo's pull request is reviewed and merged to `master`.
1.0
Rambo needs to make his first commit - It just needs to happen ## Criteria - [ ] Rambo checks out a new local branch off `master`. - [ ] Rambo scours the README.md and finds a typo to fix or improvement to make, and commits the change(s) to his local branch. - [ ] Rambo pushes his local branch to this GitHub project. - [ ] Rambo creates a pull request. - [ ] Rambo's pull request is reviewed and merged to `master`.
process
rambo needs to make his first commit it just needs to happen criteria rambo checks out a new local branch off master rambo scours the readme md and finds a typo to fix or improvement to make and commits the change s to his local branch rambo pushes his local branch to this github project rambo creates a pull request rambo s pull request is reviewed and merged to master
1
477,331
13,760,128,132
IssuesEvent
2020-10-07 05:03:29
AY2021S1-CS2103T-T09-4/tp
https://api.github.com/repos/AY2021S1-CS2103T-T09-4/tp
opened
Data Tracking
Priority.High type.DG type.Function
Track relevant user data such as: 1. Number of correctly answered flashcards 2. Previous scores 3. Average time taken in total 4. Time of quiz As well as the following subtasks: 1. Tests 2. Update DG
1.0
Data Tracking - Track relevant user data such as: 1. Number of correctly answered flashcards 2. Previous scores 3. Average time taken in total 4. Time of quiz As well as the following subtasks: 1. Tests 2. Update DG
non_process
data tracking track relevant user data such as number of correctly answered flashcards previous scores average time taken in total time of quiz as well as the following subtasks tests update dg
0
1,759
4,462,267,319
IssuesEvent
2016-08-24 09:18:10
opentrials/opentrials
https://api.github.com/repos/opentrials/opentrials
opened
Introduce `database.identifiers` table
API Processors
# Overview For now we handle identifiers in `database.records.identifiers` jsonb dict and it's the most important part of our deduplication system. We have following problems with it: - can't have a few identifiers from one source (because it's dict) - should search using query like `identifiers @> '{"nct": "NCT124123423"}' (we need key+value) because of the way how Postgres GIN indexes works. Introducing normalized `database.identifiers` table will solve this problems and open some other possibilities to work with identifiers.
1.0
Introduce `database.identifiers` table - # Overview For now we handle identifiers in `database.records.identifiers` jsonb dict and it's the most important part of our deduplication system. We have following problems with it: - can't have a few identifiers from one source (because it's dict) - should search using query like `identifiers @> '{"nct": "NCT124123423"}' (we need key+value) because of the way how Postgres GIN indexes works. Introducing normalized `database.identifiers` table will solve this problems and open some other possibilities to work with identifiers.
process
introduce database identifiers table overview for now we handle identifiers in database records identifiers jsonb dict and it s the most important part of our deduplication system we have following problems with it can t have a few identifiers from one source because it s dict should search using query like identifiers nct we need key value because of the way how postgres gin indexes works introducing normalized database identifiers table will solve this problems and open some other possibilities to work with identifiers
1
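The normalization proposed in the record above — moving identifiers out of a per-record jsonb dict into their own table — can be sketched with an in-memory SQLite database. The schema and column names below are assumptions for illustration, not the project's actual DDL; the point is that a separate table allows several identifiers from one source per record, and turns the jsonb containment lookup (`identifiers @> '{"nct": ...}'`) into a plain key+value equality query.

```python
import sqlite3

# Illustrative normalized table; schema/column names are assumptions.
conn = sqlite3.connect(":memory:")
conn.execute(
    """
    CREATE TABLE identifiers (
        record_id INTEGER,
        source    TEXT,
        value     TEXT
    )
    """
)

# Unlike a dict keyed by source, a row-per-identifier table can hold
# multiple identifiers from the same source for one record.
rows = [
    (1, "nct", "NCT124123423"),
    (1, "nct", "NCT999999999"),
    (1, "isrctn", "ISRCTN12345"),
]
conn.executemany("INSERT INTO identifiers VALUES (?, ?, ?)", rows)

# Lookup becomes an indexable equality query on (source, value).
found = conn.execute(
    "SELECT record_id FROM identifiers WHERE source = ? AND value = ?",
    ("nct", "NCT124123423"),
).fetchall()
print(found)
```

In Postgres the same idea would be a real table with a btree index on `(source, value)`, replacing the GIN-indexed jsonb containment search the record mentions.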
17,267
23,049,644,743
IssuesEvent
2022-07-24 12:38:14
googleapis/python-pubsublite
https://api.github.com/repos/googleapis/python-pubsublite
closed
Add owl-bot as a required check
type: process api: pubsublite
Owl bot failed in #342 and the PR should not have been merged. This issue tracks adding owl bot as a required check.
1.0
Add owl-bot as a required check - Owl bot failed in #342 and the PR should not have been merged. This issue tracks adding owl bot as a required check.
process
add owl bot as a required check owl bot failed in and the pr should not have been merged this issue tracks adding owl bot as a required check
1
167,305
14,107,666,021
IssuesEvent
2020-11-06 16:37:29
pypa/bandersnatch
https://api.github.com/repos/pypa/bandersnatch
opened
Remove mentions of blacklist / whitelist from documentation
documentation
Change to denylist / allowlist everywhere.
1.0
Remove mentions of blacklist / whitelist from documentation - Change to denylist / allowlist everywhere.
non_process
remove mentions of blacklist whitelist from documentation change to denylist allowlist everywhere
0
6,681
9,799,151,577
IssuesEvent
2019-06-11 13:52:57
GoogleCloudPlatform/golang-samples
https://api.github.com/repos/GoogleCloudPlatform/golang-samples
closed
testing: build failed but marked as successful
Fixit type: process
http://sponge/805c10c0-1a89-4150-be16-9096a46cb569 ``` + tee gotest.out appengine/gophers/gophers-6/main.go:26:2: cannot find package "github.com/gofrs/uuid" in any of: /usr/local/go/src/github.com/gofrs/uuid (from $GOROOT) /tmpfs/src/gopath/src/github.com/gofrs/uuid (from $GOPATH) asset/quickstart/batch-get-assets-history/main.go:16:2: cannot find package "cloud.google.com/go/asset/v1beta1" in any of: /usr/local/go/src/cloud.google.com/go/asset/v1beta1 (from $GOROOT) /tmpfs/src/gopath/src/cloud.google.com/go/asset/v1beta1 (from $GOPATH) asset/quickstart/batch-get-assets-history/main.go:19:2: cannot find package "google.golang.org/genproto/googleapis/cloud/asset/v1beta1" in any of: /usr/local/go/src/google.golang.org/genproto/googleapis/cloud/asset/v1beta1 (from $GOROOT) /tmpfs/src/gopath/src/google.golang.org/genproto/googleapis/cloud/asset/v1beta1 (from $GOPATH) container_registry/container_analysis/src/sample/samples.go:14:2: cannot find package "cloud.google.com/go/containeranalysis/apiv1beta1" in any of: /usr/local/go/src/cloud.google.com/go/containeranalysis/apiv1beta1 (from $GOROOT) /tmpfs/src/gopath/src/cloud.google.com/go/containeranalysis/apiv1beta1 (from $GOPATH) container_registry/container_analysis/src/sample/samples.go:17:2: cannot find package "google.golang.org/genproto/googleapis/devtools/containeranalysis/v1beta1/grafeas" in any of: /usr/local/go/src/google.golang.org/genproto/googleapis/devtools/containeranalysis/v1beta1/grafeas (from $GOROOT) /tmpfs/src/gopath/src/google.golang.org/genproto/googleapis/devtools/containeranalysis/v1beta1/grafeas (from $GOPATH) container_registry/container_analysis/src/sample/samples.go:18:2: cannot find package "google.golang.org/genproto/googleapis/devtools/containeranalysis/v1beta1/vulnerability" in any of: /usr/local/go/src/google.golang.org/genproto/googleapis/devtools/containeranalysis/v1beta1/vulnerability (from $GOROOT) 
/tmpfs/src/gopath/src/google.golang.org/genproto/googleapis/devtools/containeranalysis/v1beta1/vulnerability (from $GOPATH) jobs/v2/quickstart/main.go:16:2: cannot find package "google.golang.org/api/jobs/v2" in any of: /usr/local/go/src/google.golang.org/api/jobs/v2 (from $GOROOT) /tmpfs/src/gopath/src/google.golang.org/api/jobs/v2 (from $GOPATH) jobs/v3/howto/auto_complete_sample.go:13:2: cannot find package "google.golang.org/api/jobs/v3" in any of: /usr/local/go/src/google.golang.org/api/jobs/v3 (from $GOROOT) /tmpfs/src/gopath/src/google.golang.org/api/jobs/v3 (from $GOPATH) opencensus/metrics_quickstart/main.go:17:2: cannot find package "contrib.go.opencensus.io/exporter/stackdriver" in any of: /usr/local/go/src/contrib.go.opencensus.io/exporter/stackdriver (from $GOROOT) /tmpfs/src/gopath/src/contrib.go.opencensus.io/exporter/stackdriver (from $GOPATH) texttospeech/list_voices/list_voices.go:15:2: cannot find package "cloud.google.com/go/texttospeech/apiv1" in any of: /usr/local/go/src/cloud.google.com/go/texttospeech/apiv1 (from $GOROOT) /tmpfs/src/gopath/src/cloud.google.com/go/texttospeech/apiv1 (from $GOPATH) texttospeech/list_voices/list_voices.go:16:2: cannot find package "google.golang.org/genproto/googleapis/cloud/texttospeech/v1" in any of: /usr/local/go/src/google.golang.org/genproto/googleapis/cloud/texttospeech/v1 (from $GOROOT) /tmpfs/src/gopath/src/google.golang.org/genproto/googleapis/cloud/texttospeech/v1 (from $GOPATH) trace/trace_quickstart/main.go:16:2: cannot find package "contrib.go.opencensus.io/exporter/stackdriver/propagation" in any of: /usr/local/go/src/contrib.go.opencensus.io/exporter/stackdriver/propagation (from $GOROOT) /tmpfs/src/gopath/src/contrib.go.opencensus.io/exporter/stackdriver/propagation (from $GOPATH) + cat gotest.out + /tmpfs/src/gopath/bin/go-junit-report -set-exit-code + gimmeproj -project golang-samples-tests done golang-samples-tests-6 Returned golang-samples-tests-6 to the pool. 
[ID: 1102742] Build finished after 31 secs, exit value: 0 ```
1.0
testing: build failed but marked as successful - http://sponge/805c10c0-1a89-4150-be16-9096a46cb569 ``` + tee gotest.out appengine/gophers/gophers-6/main.go:26:2: cannot find package "github.com/gofrs/uuid" in any of: /usr/local/go/src/github.com/gofrs/uuid (from $GOROOT) /tmpfs/src/gopath/src/github.com/gofrs/uuid (from $GOPATH) asset/quickstart/batch-get-assets-history/main.go:16:2: cannot find package "cloud.google.com/go/asset/v1beta1" in any of: /usr/local/go/src/cloud.google.com/go/asset/v1beta1 (from $GOROOT) /tmpfs/src/gopath/src/cloud.google.com/go/asset/v1beta1 (from $GOPATH) asset/quickstart/batch-get-assets-history/main.go:19:2: cannot find package "google.golang.org/genproto/googleapis/cloud/asset/v1beta1" in any of: /usr/local/go/src/google.golang.org/genproto/googleapis/cloud/asset/v1beta1 (from $GOROOT) /tmpfs/src/gopath/src/google.golang.org/genproto/googleapis/cloud/asset/v1beta1 (from $GOPATH) container_registry/container_analysis/src/sample/samples.go:14:2: cannot find package "cloud.google.com/go/containeranalysis/apiv1beta1" in any of: /usr/local/go/src/cloud.google.com/go/containeranalysis/apiv1beta1 (from $GOROOT) /tmpfs/src/gopath/src/cloud.google.com/go/containeranalysis/apiv1beta1 (from $GOPATH) container_registry/container_analysis/src/sample/samples.go:17:2: cannot find package "google.golang.org/genproto/googleapis/devtools/containeranalysis/v1beta1/grafeas" in any of: /usr/local/go/src/google.golang.org/genproto/googleapis/devtools/containeranalysis/v1beta1/grafeas (from $GOROOT) /tmpfs/src/gopath/src/google.golang.org/genproto/googleapis/devtools/containeranalysis/v1beta1/grafeas (from $GOPATH) container_registry/container_analysis/src/sample/samples.go:18:2: cannot find package "google.golang.org/genproto/googleapis/devtools/containeranalysis/v1beta1/vulnerability" in any of: /usr/local/go/src/google.golang.org/genproto/googleapis/devtools/containeranalysis/v1beta1/vulnerability (from $GOROOT) 
/tmpfs/src/gopath/src/google.golang.org/genproto/googleapis/devtools/containeranalysis/v1beta1/vulnerability (from $GOPATH) jobs/v2/quickstart/main.go:16:2: cannot find package "google.golang.org/api/jobs/v2" in any of: /usr/local/go/src/google.golang.org/api/jobs/v2 (from $GOROOT) /tmpfs/src/gopath/src/google.golang.org/api/jobs/v2 (from $GOPATH) jobs/v3/howto/auto_complete_sample.go:13:2: cannot find package "google.golang.org/api/jobs/v3" in any of: /usr/local/go/src/google.golang.org/api/jobs/v3 (from $GOROOT) /tmpfs/src/gopath/src/google.golang.org/api/jobs/v3 (from $GOPATH) opencensus/metrics_quickstart/main.go:17:2: cannot find package "contrib.go.opencensus.io/exporter/stackdriver" in any of: /usr/local/go/src/contrib.go.opencensus.io/exporter/stackdriver (from $GOROOT) /tmpfs/src/gopath/src/contrib.go.opencensus.io/exporter/stackdriver (from $GOPATH) texttospeech/list_voices/list_voices.go:15:2: cannot find package "cloud.google.com/go/texttospeech/apiv1" in any of: /usr/local/go/src/cloud.google.com/go/texttospeech/apiv1 (from $GOROOT) /tmpfs/src/gopath/src/cloud.google.com/go/texttospeech/apiv1 (from $GOPATH) texttospeech/list_voices/list_voices.go:16:2: cannot find package "google.golang.org/genproto/googleapis/cloud/texttospeech/v1" in any of: /usr/local/go/src/google.golang.org/genproto/googleapis/cloud/texttospeech/v1 (from $GOROOT) /tmpfs/src/gopath/src/google.golang.org/genproto/googleapis/cloud/texttospeech/v1 (from $GOPATH) trace/trace_quickstart/main.go:16:2: cannot find package "contrib.go.opencensus.io/exporter/stackdriver/propagation" in any of: /usr/local/go/src/contrib.go.opencensus.io/exporter/stackdriver/propagation (from $GOROOT) /tmpfs/src/gopath/src/contrib.go.opencensus.io/exporter/stackdriver/propagation (from $GOPATH) + cat gotest.out + /tmpfs/src/gopath/bin/go-junit-report -set-exit-code + gimmeproj -project golang-samples-tests done golang-samples-tests-6 Returned golang-samples-tests-6 to the pool. 
[ID: 1102742] Build finished after 31 secs, exit value: 0 ```
process
testing build failed but marked as successful tee gotest out appengine gophers gophers main go cannot find package github com gofrs uuid in any of usr local go src github com gofrs uuid from goroot tmpfs src gopath src github com gofrs uuid from gopath asset quickstart batch get assets history main go cannot find package cloud google com go asset in any of usr local go src cloud google com go asset from goroot tmpfs src gopath src cloud google com go asset from gopath asset quickstart batch get assets history main go cannot find package google golang org genproto googleapis cloud asset in any of usr local go src google golang org genproto googleapis cloud asset from goroot tmpfs src gopath src google golang org genproto googleapis cloud asset from gopath container registry container analysis src sample samples go cannot find package cloud google com go containeranalysis in any of usr local go src cloud google com go containeranalysis from goroot tmpfs src gopath src cloud google com go containeranalysis from gopath container registry container analysis src sample samples go cannot find package google golang org genproto googleapis devtools containeranalysis grafeas in any of usr local go src google golang org genproto googleapis devtools containeranalysis grafeas from goroot tmpfs src gopath src google golang org genproto googleapis devtools containeranalysis grafeas from gopath container registry container analysis src sample samples go cannot find package google golang org genproto googleapis devtools containeranalysis vulnerability in any of usr local go src google golang org genproto googleapis devtools containeranalysis vulnerability from goroot tmpfs src gopath src google golang org genproto googleapis devtools containeranalysis vulnerability from gopath jobs quickstart main go cannot find package google golang org api jobs in any of usr local go src google golang org api jobs from goroot tmpfs src gopath src google golang org api jobs from gopath jobs howto 
auto complete sample go cannot find package google golang org api jobs in any of usr local go src google golang org api jobs from goroot tmpfs src gopath src google golang org api jobs from gopath opencensus metrics quickstart main go cannot find package contrib go opencensus io exporter stackdriver in any of usr local go src contrib go opencensus io exporter stackdriver from goroot tmpfs src gopath src contrib go opencensus io exporter stackdriver from gopath texttospeech list voices list voices go cannot find package cloud google com go texttospeech in any of usr local go src cloud google com go texttospeech from goroot tmpfs src gopath src cloud google com go texttospeech from gopath texttospeech list voices list voices go cannot find package google golang org genproto googleapis cloud texttospeech in any of usr local go src google golang org genproto googleapis cloud texttospeech from goroot tmpfs src gopath src google golang org genproto googleapis cloud texttospeech from gopath trace trace quickstart main go cannot find package contrib go opencensus io exporter stackdriver propagation in any of usr local go src contrib go opencensus io exporter stackdriver propagation from goroot tmpfs src gopath src contrib go opencensus io exporter stackdriver propagation from gopath cat gotest out tmpfs src gopath bin go junit report set exit code gimmeproj project golang samples tests done golang samples tests returned golang samples tests to the pool build finished after secs exit value
1
2,087
4,912,970,615
IssuesEvent
2016-11-23 10:52:59
Alfresco/alfresco-ng2-components
https://api.github.com/repos/Alfresco/alfresco-ng2-components
opened
If start form and first user task use same form, form is populated
browser: all bug comp: activiti-processList
1. Import below app 2. Start process 3. Enter fields in start form and complete 4. Go to task list > first user task in process **Expected results** Form attached to task is not populated **Actual results** Form attached to task is populated with that entered in step 3 **app used in testing** [simple app.zip](https://github.com/Alfresco/alfresco-ng2-components/files/608637/simple.app.zip) [populated.mp4.zip](https://github.com/Alfresco/alfresco-ng2-components/files/608611/populated.mp4.zip)
1.0
If start form and first user task use same form, form is populated - 1. Import below app 2. Start process 3. Enter fields in start form and complete 4. Go to task list > first user task in process **Expected results** Form attached to task is not populated **Actual results** Form attached to task is populated with that entered in step 3 **app used in testing** [simple app.zip](https://github.com/Alfresco/alfresco-ng2-components/files/608637/simple.app.zip) [populated.mp4.zip](https://github.com/Alfresco/alfresco-ng2-components/files/608611/populated.mp4.zip)
process
if start form and first user task use same form form is populated import below app start process enter fields in start form and complete go to task list first user task in process expected results form attached to task is not populated actual results form attached to task is populated with that entered in step app used in testing
1
97,143
20,171,073,924
IssuesEvent
2022-02-10 10:27:36
Regalis11/Barotrauma
https://api.github.com/repos/Regalis11/Barotrauma
closed
[Unstable] Can't copy/CTRL+Drag docking ports/hatches in editor
Bug Code
- [x] I have searched the issue tracker to check if the issue has already been reported. **Description** Can't copy/CTRL+Drag docking ports/hatches in editor Docking ports aren't saved in assemblies either **Steps To Reproduce** Select a docking hatch Hold CTRL and drag it Nothing happens CTRL+C CTRL+V Nothing happens **Version** v0.16.3.0
1.0
[Unstable] Can't copy/CTRL+Drag docking ports/hatches in editor - - [x] I have searched the issue tracker to check if the issue has already been reported. **Description** Can't copy/CTRL+Drag docking ports/hatches in editor Docking ports aren't saved in assemblies either **Steps To Reproduce** Select a docking hatch Hold CTRL and drag it Nothing happens CTRL+C CTRL+V Nothing happens **Version** v0.16.3.0
non_process
can t copy ctrl drag docking ports hatches in editor i have searched the issue tracker to check if the issue has already been reported description can t copy ctrl drag docking ports hatches in editor docking ports aren t saved in assemblies either steps to reproduce select a docking hatch hold ctrl and drag it nothing happens ctrl c ctrl v nothing happens version
0
538,263
15,765,652,474
IssuesEvent
2021-03-31 14:20:49
mskcc/pluto-cwl
https://api.github.com/repos/mskcc/pluto-cwl
opened
add sample concordance / discordance comparisons
high priority
Need to include information about concordant and discordant variants between a number of samples. The most basic case is 1 research sample vs. 1 clinical sample. We already have - [`samples_fillout_workflow.cwl`](https://github.com/mskcc/pluto-cwl/blob/master/cwl/samples_fillout_workflow.cwl) for doing fillout on a set of samples, each with their own maf files (variant list) - [`fillout_workflow.cwl`](https://github.com/mskcc/pluto-cwl/blob/master/cwl/fillout_workflow.cwl) I think `samples_fillout_workflow.cwl` might fit this use case, however it could also be supplemented with `bcftools isec` as per #41 and Conpair/Somalier/etc. as per #32 Need to wrap all this up into a single cohesive package as per #46
1.0
add sample concordance / discordance comparisons - Need to include information about concordant and discordant variants between a number of samples. The most basic case is 1 research sample vs. 1 clinical sample. We already have - [`samples_fillout_workflow.cwl`](https://github.com/mskcc/pluto-cwl/blob/master/cwl/samples_fillout_workflow.cwl) for doing fillout on a set of samples, each with their own maf files (variant list) - [`fillout_workflow.cwl`](https://github.com/mskcc/pluto-cwl/blob/master/cwl/fillout_workflow.cwl) I think `samples_fillout_workflow.cwl` might fit this use case, however it could also be supplemented with `bcftools isec` as per #41 and Conpair/Somalier/etc. as per #32 Need to wrap all this up into a single cohesive package as per #46
non_process
add sample concordance discordance comparisons need to include information about concordant and discordant variants between a number of samples the most basic case is research sample vs clinical sample we already have for doing fillout on a set of samples each with their own maf files variant list i think samples fillout workflow cwl might fit this use case however it could also be supplemented with bcftools isec as per and conpair somalier etc as per need to wrap all this up into a single cohesive package as per
0
51,343
6,156,240,970
IssuesEvent
2017-06-28 16:16:17
opengovfoundation/madison
https://api.github.com/repos/opengovfoundation/madison
closed
Test suite for API routes in laravel
Difficulty (3): Give me some time Needs: Laravel Type: Testing / QA
Since we are starting to modularize a bit and are making madison core just an API layer, it would be good to have a nicely defined set of routes that need to be functioning and have tests for each. Essentially these will be integration tests for just the API layer.
1.0
Test suite for API routes in laravel - Since we are starting to modularize a bit and are making madison core just an API layer, it would be good to have a nicely defined set of routes that need to be functioning and have tests for each. Essentially these will be integration tests for just the API layer.
non_process
test suite for api routes in laravel since we are starting to modularize a bit and are making madison core just an api layer it would be good to have a nicely defined set of routes that need to be functioning and have tests for each essentially these will be integration tests for just the api layer
0
375,023
26,143,349,753
IssuesEvent
2022-12-29 22:27:14
use-cardano/use-cardano
https://api.github.com/repos/use-cardano/use-cardano
closed
Where to set data within the library
documentation question
As it is now, some of the operations are performed in the components. This has the consequence that if the user opt-out of using the components, the behavior will be different. For example how warnings and errors are set. This is also a benefit, since it allows for a large degree of customization, but might be unexpected.
1.0
Where to set data within the library - As it is now, some of the operations are performed in the components. This has the consequence that if the user opt-out of using the components, the behavior will be different. For example how warnings and errors are set. This is also a benefit, since it allows for a large degree of customization, but might be unexpected.
non_process
where to set data within the library as it is now some of the operations are performed in the components this has the consequence that if the user opt out of using the components the behavior will be different for example how warnings and errors are set this is also a benefit since it allows for a large degree of customization but might be unexpected
0
289,343
8,869,222,803
IssuesEvent
2019-01-11 04:00:21
StrangeLoopGames/EcoIssues
https://api.github.com/repos/StrangeLoopGames/EcoIssues
opened
Moved a stack of Sandstone from a small stockpile to a big one, disappeared
High Priority
I moved a 12 stack of sandstone from the small stockpile to the first linked one, it disappeared. Then I dragged in the 4 I was carrying to see if it would update the number to include the 12 but it didn't. ![image](https://user-images.githubusercontent.com/431728/51012505-2a02e580-15a9-11e9-8dce-7b82bd771499.png)
1.0
Moved a stack of Sandstone from a small stockpile to a big one, disappeared - I moved a 12 stack of sandstone from the small stockpile to the first linked one, it disappeared. Then I dragged in the 4 I was carrying to see if it would update the number to include the 12 but it didn't. ![image](https://user-images.githubusercontent.com/431728/51012505-2a02e580-15a9-11e9-8dce-7b82bd771499.png)
non_process
moved a stack of sandstone from a small stockpile to a big one disappeared i moved a stack of sandstone from the small stockpile to the first linked one it disappeared then i dragged in the i was carrying to see if it would update the number to include the but it didn t
0
20,412
27,072,109,015
IssuesEvent
2023-02-14 07:54:03
bazelbuild/bazel
https://api.github.com/repos/bazelbuild/bazel
closed
Long running Genrule can't finish when "jobs" bigger then 1, thus compilation errors out.
P4 type: support / not a bug (process) team-Rules-CPP stale
### [BUG?] Description of the problem: **TLDR**: Long running genrule dependency is not allowed to finish before the compilation of "who depends on it" starts. Thus compilation errors out as the generated files have not been created yet. Genrule takes around 20s with 4 cores to complete. Maybe not standard procedure but I am trying to compile a non bazel library project once as a dependency of one of my binaries covered under bazel. For that, 0. Downloaded my project with tf_http_archive 1. I used a genrule that makes a call to cmake and make. 2. Then created a cc_library depend on this genrule output file 3. Then made the binary depend on this library. If I execute build command with multiple jobs, genrule does not have time to finish, because the compilation of the binary starts straight away. Scheduling bazel build with one job or having a long queue of actions that delay the build of my binary allows the genrule to finish and the compilation to successfully complete. This is my first week with bazel and I am not sure how to debug or what is going wrong. ### Bugs: what's the simplest, easiest way to reproduce this bug? Please provide a minimal example if possible. See files below. ### What operating system are you running Bazel on? Docker tensorflow/tensorflow:nightly-custom-op-ubuntu16 as of Dec/03/2019. bazel 1.1.0 ### What's the output of `bazel info release`? 
``` INFO: Options provided by the client: Inherited 'common' options: --isatty=1 --terminal_columns=275 INFO: Reading rc options for 'info' from /working_dir/tensorflow/.bazelrc: Inherited 'build' options: --apple_platform_type=macos --define framework_shared_object=true --define open_source_build=true --java_toolchain=//third_party/toolchains/java:tf_java_toolchain --host_java_toolchain=//third_party/toolchains/java:tf_java_toolchain --define=use_fast_cpp_protos=true --define=allow_oversize_protos=true --spawn_strategy=standalone -c opt --announce_rc --define=grpc_no_ares=true --noincompatible_remove_legacy_whole_archive --enable_platform_specific_config --config=v2 INFO: Found applicable config definition build:v2 in file /working_dir/tensorflow/.bazelrc: --define=tf_api_version=2 --action_env=TF2_BEHAVIOR=1 INFO: Found applicable config definition build:linux in file /working_dir/tensorflow/.bazelrc: --copt=-w --define=PREFIX=/usr --define=LIBDIR=$(PREFIX)/lib --define=INCLUDEDIR=$(PREFIX)/include --cxxopt=-std=c++14 --host_cxxopt=-std=c++14 release 1.1.0 ``` ### Have you found anything relevant by searching the web? Tried the `https://github.com/bazelbuild/rules_foreign_cc` but it does not fit my needs as it is complex to integrate in the framework (not using http_archive but tf_http_archive). 
### Files involved: workspace.bzl ``` tf_http_archive( name = "systemc", build_file = clean_dep("//third_party:systemc.BUILD"), sha256 = "5781b9a351e5afedabc37d145e5f7edec08f3fd5de00ffeb8fa1f3086b1f7b3f", urls = [ "https://www.accellera.org/images/downloads/standards/systemc/systemc-2.3.3.tar.gz", "https://www.accellera.org/images/downloads/standards/systemc/systemc-2.3.3.tar.gz", ], ) ``` BUILD ``` cc_binary( name = "systemc_model", srcs = [ "systemc_main.sc.cc", ], deps = [ "@systemc//:systemc", ], ) ``` third_party/systemc.BUILD ``` licenses(["notice"]) package(default_visibility = ["//visibility:public"]) genrule( name = "libsystemc", srcs = [], outs = ["systemc-2.3.3/install/lib/libsystemc.a"], cmd = "cmake -DCMAKE_INSTALL_PREFIX=external/systemc/systemc-2.3.3/install -DCMAKE_CXX_STANDARD=14 -DCMAKE_INSTALL_INCLUDEDIR=include/systemc -DBUILD_SHARED_LIBS=off -Bexternal/systemc/systemc-2.3.3/build -Hexternal/systemc/systemc-2.3.3 &&" + "make -C external/systemc/systemc-2.3.3/build install -j4 &&" + "cp external/systemc/systemc-2.3.3/install/lib/libsystemc.a $@", ) cc_library( name = "systemc", srcs = ["systemc-2.3.3/install/lib/libsystemc.a"], hdrs = glob([ "systemc-2.3.3/install/include/system.h", ]), copts = ["std=c++14"], data = [":libsystemc"], includes = [ "systemc-2.3.3/install/include", "systemc-2.3.3/install/include/systemc", ], ) ``` Success messages when running `bazel build --jobs 1 --explain=file.txt --verbose_explanations tensorflow/lite/tools/systemc:systemc_model` ``` Build options: --apple_platform_type=macos --define='framework_shared_object=true' --define='open_source_build=true' --java_toolchain=//third_party/toolchains/java:tf_java_toolchain --host_java_toolchain=//third_party/toolchains/java:tf_java_toolchain --define='use_fast_cpp_protos=true' --define='allow_oversize_protos=true' --spawn_strategy=standalone --compilation_mode=opt --announce_rc --define='grpc_no_ares=true' --noincompatible_remove_legacy_whole_archive 
--enable_platform_specific_config --copt=-w --define='PREFIX=/usr' --define='LIBDIR=$(PREFIX)/lib' --define='INCLUDEDIR=$(PREFIX)/include' --cxxopt='-std=c++14' --host_cxxopt='-std=c++14' --config=v2 --define='tf_api_version=2' --action_env='TF2_BEHAVIOR=1' --jobs=1 --explain=file.txt --verbose_explanations Executing action 'BazelWorkspaceStatusAction stable-status.txt': unconditional execution is requested. Executing action 'Executing genrule @systemc//:libsystemc': no entry in the cache (action is new). Executing action 'Creating source manifest for //tensorflow/lite/tools/systemc:systemc_model': no entry in the cache (action is new). Executing action 'Creating runfiles tree bazel-out/k8-opt/bin/tensorflow/lite/tools/systemc/systemc_model.runfiles': no entry in the cache (action is new). Executing action 'Writing file tensorflow/lite/tools/systemc/systemc_model-2.params': no entry in the cache (action is new). Executing action 'Compiling tensorflow/lite/tools/systemc/systemc_main.sc.cc': no entry in the cache (action is new). Executing action 'Linking tensorflow/lite/tools/systemc/systemc_model': no entry in the cache (action is new). 
``` Error messages when running unlimited jobs `bazel build --explain=file.txt --verbose_explanations tensorflow/lite/tools/systemc:systemc_model` ``` Build options: --apple_platform_type=macos --define='framework_shared_object=true' --define='open_source_build=true' --java_toolchain=//third_party/toolchains/java:tf_java_toolchain --host_java_toolchain=//third_party/toolchains/java:tf_java_toolchain --define='use_fast_cpp_protos=true' --define='allow_oversize_protos=true' --spawn_strategy=standalone --compilation_mode=opt --announce_rc --define='grpc_no_ares=true' --noincompatible_remove_legacy_whole_archive --enable_platform_specific_config --copt=-w --define='PREFIX=/usr' --define='LIBDIR=$(PREFIX)/lib' --define='INCLUDEDIR=$(PREFIX)/include' --cxxopt='-std=c++14' --host_cxxopt='-std=c++14' --config=v2 --define='tf_api_version=2' --action_env='TF2_BEHAVIOR=1' --explain=file.txt --verbose_explanations Executing action 'BazelWorkspaceStatusAction stable-status.txt': unconditional execution is requested. Executing action 'Creating source manifest for //tensorflow/lite/tools/systemc:systemc_model': no entry in the cache (action is new). Executing action 'Writing file tensorflow/lite/tools/systemc/systemc_model-2.params': no entry in the cache (action is new). Executing action 'Executing genrule @systemc//:libsystemc': no entry in the cache (action is new). Executing action 'Compiling tensorflow/lite/tools/systemc/systemc_main.sc.cc': no entry in the cache (action is new). Executing action 'Creating runfiles tree bazel-out/k8-opt/bin/tensorflow/lite/tools/systemc/systemc_model.runfiles': no entry in the cache (action is new). ``` and ``` Starting local Bazel server and connecting to it... 
INFO: Writing tracer profile to '/home/developer/.cache/bazel/_bazel_developer/881ae704c1434b3a766ac83e64e752e0/command.profile.gz' INFO: Options provided by the client: Inherited 'common' options: --isatty=1 --terminal_columns=275 INFO: Reading rc options for 'build' from /working_dir/tensorflow/.bazelrc: 'build' options: --apple_platform_type=macos --define framework_shared_object=true --define open_source_build=true --java_toolchain=//third_party/toolchains/java:tf_java_toolchain --host_java_toolchain=//third_party/toolchains/java:tf_java_toolchain --define=use_fast_cpp_protos=true --define=allow_oversize_protos=true --spawn_strategy=standalone -c opt --announce_rc --define=grpc_no_ares=true --noincompatible_remove_legacy_whole_archive --enable_platform_specific_config --config=v2 INFO: Found applicable config definition build:v2 in file /working_dir/tensorflow/.bazelrc: --define=tf_api_version=2 --action_env=TF2_BEHAVIOR=1 INFO: Found applicable config definition build:linux in file /working_dir/tensorflow/.bazelrc: --copt=-w --define=PREFIX=/usr --define=LIBDIR=$(PREFIX)/lib --define=INCLUDEDIR=$(PREFIX)/include --cxxopt=-std=c++14 --host_cxxopt=-std=c++14 DEBUG: Rule 'io_bazel_rules_docker' indicated that a canonical reproducible form can be obtained by modifying arguments shallow_since = "1556410077 -0400" DEBUG: Call stack for the definition of repository 'io_bazel_rules_docker' which is a git_repository (rule definition at /home/developer/.cache/bazel/_bazel_developer/881ae704c1434b3a766ac83e64e752e0/external/bazel_tools/tools/build_defs/repo/git.bzl:195:18): - /home/developer/.cache/bazel/_bazel_developer/881ae704c1434b3a766ac83e64e752e0/external/bazel_toolchains/repositories/repositories.bzl:37:9 - /working_dir/tensorflow/WORKSPACE:37:1 INFO: Analyzed target //tensorflow/lite/tools/systemc:systemc_model (19 packages loaded, 69 targets configured). INFO: Found 1 target... 
INFO: Writing explanation of rebuilds to 'file.txt' ERROR: /working_dir/tensorflow/tensorflow/lite/tools/systemc/BUILD:22:1: C++ compilation of rule '//tensorflow/lite/tools/systemc:systemc_model' failed (Exit 1) tensorflow/lite/tools/systemc/systemc_main.sc.cc:3:29: fatal error: systemc/systemc.h: No such file or directory compilation terminated. Target //tensorflow/lite/tools/systemc:systemc_model failed to build Use --verbose_failures to see the command lines of failed build steps. INFO: Elapsed time: 3.761s, Critical Path: 0.04s INFO: 0 processes. FAILED: Build did NOT complete successfully ```
1.0
Long running Genrule can't finish when "jobs" bigger than 1, thus compilation errors out. - ### [BUG?] Description of the problem: **TLDR**: Long running genrule dependency is not allowed to finish before the compilation of "who depends on it" starts. Thus compilation errors out as the generated files have not been created yet. Genrule takes around 20s with 4 cores to complete. Maybe not standard procedure but I am trying to compile a non bazel library project once as a dependency of one of my binaries covered under bazel. For that, 0. Downloaded my project with tf_http_archive 1. I used a genrule that makes a call to cmake and make. 2. Then created a cc_library depend on this genrule output file 3. Then made the binary depend on this library. If I execute build command with multiple jobs, genrule does not have time to finish, because the compilation of the binary starts straight away. Scheduling bazel build with one job or having a long queue of actions that delay the build of my binary allows the genrule to finish and the compilation to successfully complete. This is my first week with bazel and I am not sure how to debug or what is going wrong. ### Bugs: what's the simplest, easiest way to reproduce this bug? Please provide a minimal example if possible. See files below. ### What operating system are you running Bazel on? Docker tensorflow/tensorflow:nightly-custom-op-ubuntu16 as of Dec/03/2019. bazel 1.1.0 ### What's the output of `bazel info release`? 
``` INFO: Options provided by the client: Inherited 'common' options: --isatty=1 --terminal_columns=275 INFO: Reading rc options for 'info' from /working_dir/tensorflow/.bazelrc: Inherited 'build' options: --apple_platform_type=macos --define framework_shared_object=true --define open_source_build=true --java_toolchain=//third_party/toolchains/java:tf_java_toolchain --host_java_toolchain=//third_party/toolchains/java:tf_java_toolchain --define=use_fast_cpp_protos=true --define=allow_oversize_protos=true --spawn_strategy=standalone -c opt --announce_rc --define=grpc_no_ares=true --noincompatible_remove_legacy_whole_archive --enable_platform_specific_config --config=v2 INFO: Found applicable config definition build:v2 in file /working_dir/tensorflow/.bazelrc: --define=tf_api_version=2 --action_env=TF2_BEHAVIOR=1 INFO: Found applicable config definition build:linux in file /working_dir/tensorflow/.bazelrc: --copt=-w --define=PREFIX=/usr --define=LIBDIR=$(PREFIX)/lib --define=INCLUDEDIR=$(PREFIX)/include --cxxopt=-std=c++14 --host_cxxopt=-std=c++14 release 1.1.0 ``` ### Have you found anything relevant by searching the web? Tried the `https://github.com/bazelbuild/rules_foreign_cc` but it does not fit my needs as it is complex to integrate in the framework (not using http_archive but tf_http_archive). 
### Files involved: workspace.bzl ``` tf_http_archive( name = "systemc", build_file = clean_dep("//third_party:systemc.BUILD"), sha256 = "5781b9a351e5afedabc37d145e5f7edec08f3fd5de00ffeb8fa1f3086b1f7b3f", urls = [ "https://www.accellera.org/images/downloads/standards/systemc/systemc-2.3.3.tar.gz", "https://www.accellera.org/images/downloads/standards/systemc/systemc-2.3.3.tar.gz", ], ) ``` BUILD ``` cc_binary( name = "systemc_model", srcs = [ "systemc_main.sc.cc", ], deps = [ "@systemc//:systemc", ], ) ``` third_party/systemc.BUILD ``` licenses(["notice"]) package(default_visibility = ["//visibility:public"]) genrule( name = "libsystemc", srcs = [], outs = ["systemc-2.3.3/install/lib/libsystemc.a"], cmd = "cmake -DCMAKE_INSTALL_PREFIX=external/systemc/systemc-2.3.3/install -DCMAKE_CXX_STANDARD=14 -DCMAKE_INSTALL_INCLUDEDIR=include/systemc -DBUILD_SHARED_LIBS=off -Bexternal/systemc/systemc-2.3.3/build -Hexternal/systemc/systemc-2.3.3 &&" + "make -C external/systemc/systemc-2.3.3/build install -j4 &&" + "cp external/systemc/systemc-2.3.3/install/lib/libsystemc.a $@", ) cc_library( name = "systemc", srcs = ["systemc-2.3.3/install/lib/libsystemc.a"], hdrs = glob([ "systemc-2.3.3/install/include/system.h", ]), copts = ["std=c++14"], data = [":libsystemc"], includes = [ "systemc-2.3.3/install/include", "systemc-2.3.3/install/include/systemc", ], ) ``` Success messages when running `bazel build --jobs 1 --explain=file.txt --verbose_explanations tensorflow/lite/tools/systemc:systemc_model` ``` Build options: --apple_platform_type=macos --define='framework_shared_object=true' --define='open_source_build=true' --java_toolchain=//third_party/toolchains/java:tf_java_toolchain --host_java_toolchain=//third_party/toolchains/java:tf_java_toolchain --define='use_fast_cpp_protos=true' --define='allow_oversize_protos=true' --spawn_strategy=standalone --compilation_mode=opt --announce_rc --define='grpc_no_ares=true' --noincompatible_remove_legacy_whole_archive 
--enable_platform_specific_config --copt=-w --define='PREFIX=/usr' --define='LIBDIR=$(PREFIX)/lib' --define='INCLUDEDIR=$(PREFIX)/include' --cxxopt='-std=c++14' --host_cxxopt='-std=c++14' --config=v2 --define='tf_api_version=2' --action_env='TF2_BEHAVIOR=1' --jobs=1 --explain=file.txt --verbose_explanations Executing action 'BazelWorkspaceStatusAction stable-status.txt': unconditional execution is requested. Executing action 'Executing genrule @systemc//:libsystemc': no entry in the cache (action is new). Executing action 'Creating source manifest for //tensorflow/lite/tools/systemc:systemc_model': no entry in the cache (action is new). Executing action 'Creating runfiles tree bazel-out/k8-opt/bin/tensorflow/lite/tools/systemc/systemc_model.runfiles': no entry in the cache (action is new). Executing action 'Writing file tensorflow/lite/tools/systemc/systemc_model-2.params': no entry in the cache (action is new). Executing action 'Compiling tensorflow/lite/tools/systemc/systemc_main.sc.cc': no entry in the cache (action is new). Executing action 'Linking tensorflow/lite/tools/systemc/systemc_model': no entry in the cache (action is new). 
``` Error messages when running unlimited jobs `bazel build --explain=file.txt --verbose_explanations tensorflow/lite/tools/systemc:systemc_model` ``` Build options: --apple_platform_type=macos --define='framework_shared_object=true' --define='open_source_build=true' --java_toolchain=//third_party/toolchains/java:tf_java_toolchain --host_java_toolchain=//third_party/toolchains/java:tf_java_toolchain --define='use_fast_cpp_protos=true' --define='allow_oversize_protos=true' --spawn_strategy=standalone --compilation_mode=opt --announce_rc --define='grpc_no_ares=true' --noincompatible_remove_legacy_whole_archive --enable_platform_specific_config --copt=-w --define='PREFIX=/usr' --define='LIBDIR=$(PREFIX)/lib' --define='INCLUDEDIR=$(PREFIX)/include' --cxxopt='-std=c++14' --host_cxxopt='-std=c++14' --config=v2 --define='tf_api_version=2' --action_env='TF2_BEHAVIOR=1' --explain=file.txt --verbose_explanations Executing action 'BazelWorkspaceStatusAction stable-status.txt': unconditional execution is requested. Executing action 'Creating source manifest for //tensorflow/lite/tools/systemc:systemc_model': no entry in the cache (action is new). Executing action 'Writing file tensorflow/lite/tools/systemc/systemc_model-2.params': no entry in the cache (action is new). Executing action 'Executing genrule @systemc//:libsystemc': no entry in the cache (action is new). Executing action 'Compiling tensorflow/lite/tools/systemc/systemc_main.sc.cc': no entry in the cache (action is new). Executing action 'Creating runfiles tree bazel-out/k8-opt/bin/tensorflow/lite/tools/systemc/systemc_model.runfiles': no entry in the cache (action is new). ``` and ``` Starting local Bazel server and connecting to it... 
INFO: Writing tracer profile to '/home/developer/.cache/bazel/_bazel_developer/881ae704c1434b3a766ac83e64e752e0/command.profile.gz' INFO: Options provided by the client: Inherited 'common' options: --isatty=1 --terminal_columns=275 INFO: Reading rc options for 'build' from /working_dir/tensorflow/.bazelrc: 'build' options: --apple_platform_type=macos --define framework_shared_object=true --define open_source_build=true --java_toolchain=//third_party/toolchains/java:tf_java_toolchain --host_java_toolchain=//third_party/toolchains/java:tf_java_toolchain --define=use_fast_cpp_protos=true --define=allow_oversize_protos=true --spawn_strategy=standalone -c opt --announce_rc --define=grpc_no_ares=true --noincompatible_remove_legacy_whole_archive --enable_platform_specific_config --config=v2 INFO: Found applicable config definition build:v2 in file /working_dir/tensorflow/.bazelrc: --define=tf_api_version=2 --action_env=TF2_BEHAVIOR=1 INFO: Found applicable config definition build:linux in file /working_dir/tensorflow/.bazelrc: --copt=-w --define=PREFIX=/usr --define=LIBDIR=$(PREFIX)/lib --define=INCLUDEDIR=$(PREFIX)/include --cxxopt=-std=c++14 --host_cxxopt=-std=c++14 DEBUG: Rule 'io_bazel_rules_docker' indicated that a canonical reproducible form can be obtained by modifying arguments shallow_since = "1556410077 -0400" DEBUG: Call stack for the definition of repository 'io_bazel_rules_docker' which is a git_repository (rule definition at /home/developer/.cache/bazel/_bazel_developer/881ae704c1434b3a766ac83e64e752e0/external/bazel_tools/tools/build_defs/repo/git.bzl:195:18): - /home/developer/.cache/bazel/_bazel_developer/881ae704c1434b3a766ac83e64e752e0/external/bazel_toolchains/repositories/repositories.bzl:37:9 - /working_dir/tensorflow/WORKSPACE:37:1 INFO: Analyzed target //tensorflow/lite/tools/systemc:systemc_model (19 packages loaded, 69 targets configured). INFO: Found 1 target... 
INFO: Writing explanation of rebuilds to 'file.txt' ERROR: /working_dir/tensorflow/tensorflow/lite/tools/systemc/BUILD:22:1: C++ compilation of rule '//tensorflow/lite/tools/systemc:systemc_model' failed (Exit 1) tensorflow/lite/tools/systemc/systemc_main.sc.cc:3:29: fatal error: systemc/systemc.h: No such file or directory compilation terminated. Target //tensorflow/lite/tools/systemc:systemc_model failed to build Use --verbose_failures to see the command lines of failed build steps. INFO: Elapsed time: 3.761s, Critical Path: 0.04s INFO: 0 processes. FAILED: Build did NOT complete successfully ```
process
long running genrule can t finish when jobs bigger then thus compilation errors out description of the problem tldr long running genrule dependency is not allowed to finish before the compilation of who depends on it starts thus compilation errors out as the generated files have not been created yet genrule takes around with cores to complete maybe not standard procedure but i am trying to compile a non bazel library project once as a dependency of one of my binaries covered under bazel for that downloaded my project with tf http archive i used a genrule that makes a call to cmake and make then created a cc library depend on this genrule output file then made the binary depend on this library if i execute build command with multiple jobs genrule does not have time to finish because the compilation of the binary starts straight away scheduling bazel build with one job or having a long queue of actions that delay the build of my binary allows the genrule to finish and the compilation to successfully complete this is my first week with bazel and i am not sure how to debug or what is going wrong bugs what s the simplest easiest way to reproduce this bug please provide a minimal example if possible see files below what operating system are you running bazel on docker tensorflow tensorflow nightly custom op as of dec bazel what s the output of bazel info release info options provided by the client inherited common options isatty terminal columns info reading rc options for info from working dir tensorflow bazelrc inherited build options apple platform type macos define framework shared object true define open source build true java toolchain third party toolchains java tf java toolchain host java toolchain third party toolchains java tf java toolchain define use fast cpp protos true define allow oversize protos true spawn strategy standalone c opt announce rc define grpc no ares true noincompatible remove legacy whole archive enable platform specific config config info 
found applicable config definition build in file working dir tensorflow bazelrc define tf api version action env behavior info found applicable config definition build linux in file working dir tensorflow bazelrc copt w define prefix usr define libdir prefix lib define includedir prefix include cxxopt std c host cxxopt std c release have you found anything relevant by searching the web tried the but it does not fit my needs as it is complex to integrate in the framework not using http archive but tf http archive files involved workspace bzl tf http archive name systemc build file clean dep third party systemc build urls build cc binary name systemc model srcs systemc main sc cc deps systemc systemc third party systemc build licenses package default visibility genrule name libsystemc srcs outs cmd cmake dcmake install prefix external systemc systemc install dcmake cxx standard dcmake install includedir include systemc dbuild shared libs off bexternal systemc systemc build hexternal systemc systemc make c external systemc systemc build install cp external systemc systemc install lib libsystemc a cc library name systemc srcs hdrs glob systemc install include system h copts data includes systemc install include systemc install include systemc success messages when running bazel build jobs explain file txt verbose explanations tensorflow lite tools systemc systemc model build options apple platform type macos define framework shared object true define open source build true java toolchain third party toolchains java tf java toolchain host java toolchain third party toolchains java tf java toolchain define use fast cpp protos true define allow oversize protos true spawn strategy standalone compilation mode opt announce rc define grpc no ares true noincompatible remove legacy whole archive enable platform specific config copt w define prefix usr define libdir prefix lib define includedir prefix include cxxopt std c host cxxopt std c config define tf api version action env 
behavior jobs explain file txt verbose explanations executing action bazelworkspacestatusaction stable status txt unconditional execution is requested executing action executing genrule systemc libsystemc no entry in the cache action is new executing action creating source manifest for tensorflow lite tools systemc systemc model no entry in the cache action is new executing action creating runfiles tree bazel out opt bin tensorflow lite tools systemc systemc model runfiles no entry in the cache action is new executing action writing file tensorflow lite tools systemc systemc model params no entry in the cache action is new executing action compiling tensorflow lite tools systemc systemc main sc cc no entry in the cache action is new executing action linking tensorflow lite tools systemc systemc model no entry in the cache action is new error messages when running unlimited jobs bazel build explain file txt verbose explanations tensorflow lite tools systemc systemc model build options apple platform type macos define framework shared object true define open source build true java toolchain third party toolchains java tf java toolchain host java toolchain third party toolchains java tf java toolchain define use fast cpp protos true define allow oversize protos true spawn strategy standalone compilation mode opt announce rc define grpc no ares true noincompatible remove legacy whole archive enable platform specific config copt w define prefix usr define libdir prefix lib define includedir prefix include cxxopt std c host cxxopt std c config define tf api version action env behavior explain file txt verbose explanations executing action bazelworkspacestatusaction stable status txt unconditional execution is requested executing action creating source manifest for tensorflow lite tools systemc systemc model no entry in the cache action is new executing action writing file tensorflow lite tools systemc systemc model params no entry in the cache action is new executing 
action executing genrule systemc libsystemc no entry in the cache action is new executing action compiling tensorflow lite tools systemc systemc main sc cc no entry in the cache action is new executing action creating runfiles tree bazel out opt bin tensorflow lite tools systemc systemc model runfiles no entry in the cache action is new and starting local bazel server and connecting to it info writing tracer profile to home developer cache bazel bazel developer command profile gz info options provided by the client inherited common options isatty terminal columns info reading rc options for build from working dir tensorflow bazelrc build options apple platform type macos define framework shared object true define open source build true java toolchain third party toolchains java tf java toolchain host java toolchain third party toolchains java tf java toolchain define use fast cpp protos true define allow oversize protos true spawn strategy standalone c opt announce rc define grpc no ares true noincompatible remove legacy whole archive enable platform specific config config info found applicable config definition build in file working dir tensorflow bazelrc define tf api version action env behavior info found applicable config definition build linux in file working dir tensorflow bazelrc copt w define prefix usr define libdir prefix lib define includedir prefix include cxxopt std c host cxxopt std c debug rule io bazel rules docker indicated that a canonical reproducible form can be obtained by modifying arguments shallow since debug call stack for the definition of repository io bazel rules docker which is a git repository rule definition at home developer cache bazel bazel developer external bazel tools tools build defs repo git bzl home developer cache bazel bazel developer external bazel toolchains repositories repositories bzl working dir tensorflow workspace info analyzed target tensorflow lite tools systemc systemc model packages loaded targets configured 
info found target info writing explanation of rebuilds to file txt error working dir tensorflow tensorflow lite tools systemc build c compilation of rule tensorflow lite tools systemc systemc model failed exit tensorflow lite tools systemc systemc main sc cc fatal error systemc systemc h no such file or directory compilation terminated target tensorflow lite tools systemc systemc model failed to build use verbose failures to see the command lines of failed build steps info elapsed time critical path info processes failed build did not complete successfully
1
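The `systemc/systemc.h: No such file or directory` error in this record is consistent with the header never being declared as a build input: `cc_library.data` only adds a runtime dependency, and `glob()` matches only checked-in source files, so the compile action has no edge to the genrule and can be scheduled first whenever `--jobs` is greater than 1. A hedged sketch of one way to make the ordering explicit — the paths under `install/` are illustrative assumptions, not taken from the systemc tarball:

```starlark
# Sketch only: declare the installed header as a genrule output so the
# C++ compile action must wait for the genrule to finish.  Paths assumed.
genrule(
    name = "libsystemc",
    srcs = glob(["systemc-2.3.3/**"]),
    outs = [
        "install/lib/libsystemc.a",
        "install/include/systemc/systemc.h",  # assumed install location
    ],
    cmd = "...",  # the cmake && make install command from the report, unchanged
)

cc_library(
    name = "systemc",
    srcs = ["install/lib/libsystemc.a"],           # generated file => depends on :libsystemc
    hdrs = ["install/include/systemc/systemc.h"],  # generated file, not a glob
    includes = ["install/include"],
)
```

With the header listed in `outs` and referenced from `hdrs`, Bazel knows the compile of `systemc_main.sc.cc` consumes the genrule's outputs, so the ordering no longer depends on `--jobs` or scheduling luck.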
113,269
9,634,378,739
IssuesEvent
2019-05-15 21:04:55
saltstack/salt
https://api.github.com/repos/saltstack/salt
closed
unittest.loader._FailedTest.unit.cloud.clouds.test_joyent py3 failing
2019.2.1 Blocker Test Failure
The test `unittest.loader._FailedTest.unit.cloud.clouds.test_joyent` is failing on: 2019.2.1 failed salt-centos-7-py3, salt-centos-7-py3-pycryptodomex, salt-centos-7-py3-transport, salt-debian-9-py3, salt-fedora-28-py3, salt-fedora-29-py3, salt-opensuse-42-py3, salt-ubuntu-1604-py3, salt-ubuntu-1604-py3-pycryptodomex, salt-ubuntu-1604-py3-transport, salt-ubuntu-1804-py3, salt-windows-2016-py3 https://jenkinsci.saltstack.com/job/2019.2.1/view/Python3/job/salt-centos-7-py3/7/testReport/junit/unittest.loader._FailedTest.unit.cloud/clouds/test_joyent/ ``` Traceback (most recent call last): File "/usr/lib64/python3.4/unittest/case.py", line 59, in testPartExecutor yield File "/usr/lib64/python3.4/unittest/case.py", line 618, in run testMethod() File "/usr/lib64/python3.4/unittest/loader.py", line 33, in testFailure raise self._exception ImportError: Failed to import test module: unit.cloud.clouds.test_joyent Traceback (most recent call last): File "/tmp/kitchen/testing/salt/cloud/clouds/joyent.py", line 63, in <module> from M2Crypto import EVP ImportError: No module named 'M2Crypto' ```
1.0
unittest.loader._FailedTest.unit.cloud.clouds.test_joyent py3 failing - The test `unittest.loader._FailedTest.unit.cloud.clouds.test_joyent` is failing on: 2019.2.1 failed salt-centos-7-py3, salt-centos-7-py3-pycryptodomex, salt-centos-7-py3-transport, salt-debian-9-py3, salt-fedora-28-py3, salt-fedora-29-py3, salt-opensuse-42-py3, salt-ubuntu-1604-py3, salt-ubuntu-1604-py3-pycryptodomex, salt-ubuntu-1604-py3-transport, salt-ubuntu-1804-py3, salt-windows-2016-py3 https://jenkinsci.saltstack.com/job/2019.2.1/view/Python3/job/salt-centos-7-py3/7/testReport/junit/unittest.loader._FailedTest.unit.cloud/clouds/test_joyent/ ``` Traceback (most recent call last): File "/usr/lib64/python3.4/unittest/case.py", line 59, in testPartExecutor yield File "/usr/lib64/python3.4/unittest/case.py", line 618, in run testMethod() File "/usr/lib64/python3.4/unittest/loader.py", line 33, in testFailure raise self._exception ImportError: Failed to import test module: unit.cloud.clouds.test_joyent Traceback (most recent call last): File "/tmp/kitchen/testing/salt/cloud/clouds/joyent.py", line 63, in <module> from M2Crypto import EVP ImportError: No module named 'M2Crypto' ```
non_process
unittest loader failedtest unit cloud clouds test joyent failing the test unittest loader failedtest unit cloud clouds test joyent is failing on failed salt centos salt centos pycryptodomex salt centos transport salt debian salt fedora salt fedora salt opensuse salt ubuntu salt ubuntu pycryptodomex salt ubuntu transport salt ubuntu salt windows traceback most recent call last file usr unittest case py line in testpartexecutor yield file usr unittest case py line in run testmethod file usr unittest loader py line in testfailure raise self exception importerror failed to import test module unit cloud clouds test joyent traceback most recent call last file tmp kitchen testing salt cloud clouds joyent py line in from import evp importerror no module named
0
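The traceback in this record shows test collection dying on an unconditional `from M2Crypto import EVP` at module import time. A common pattern for optional third-party dependencies is to guard the import and fail only when the feature is actually used — this is a hypothetical sketch of that pattern, not the actual fix applied in salt; `sign_digest` is an invented name:

```python
# Hypothetical sketch (not the actual salt fix): guard an optional
# third-party import so that merely importing the module -- e.g. during
# test collection -- cannot fail when the dependency is missing.
try:
    from M2Crypto import EVP  # optional crypto backend
    HAS_M2CRYPTO = True
except ImportError:
    EVP = None
    HAS_M2CRYPTO = False


def sign_digest(data):
    """Fail at call time with a clear message, not at import time."""
    if not HAS_M2CRYPTO:
        raise RuntimeError("M2Crypto is required for sign_digest()")
    # ... real signing code would use EVP here ...
    return EVP
```

Tests that exercise the crypto path can then be skipped based on `HAS_M2CRYPTO`, while the rest of the suite imports and runs normally.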
141,803
18,990,089,485
IssuesEvent
2021-11-22 05:42:17
ChoeMinji/tensorflow-2.2.3
https://api.github.com/repos/ChoeMinji/tensorflow-2.2.3
opened
CVE-2015-4852 (High) detected in commons-collections-3.2.1.jar
security vulnerability
## CVE-2015-4852 - High Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>commons-collections-3.2.1.jar</b></p></summary> <p>Types that extend and augment the Java Collections Framework.</p> <p>Path to dependency file: tensorflow-2.2.3/tensorflow/java/maven/tensorflow-hadoop/pom.xml</p> <p>Path to vulnerable library: /home/wss-scanner/.m2/repository/commons-collections/commons-collections/3.2.1/commons-collections-3.2.1.jar,/home/wss-scanner/.m2/repository/commons-collections/commons-collections/3.2.1/commons-collections-3.2.1.jar</p> <p> Dependency Hierarchy: - hadoop-common-2.6.0.jar (Root Library) - :x: **commons-collections-3.2.1.jar** (Vulnerable Library) <p>Found in HEAD commit: <a href="https://github.com/ChoeMinji/tensorflow-2.2.3/commit/1f65fd168afc52c040a47230bb3cb902f7223124">1f65fd168afc52c040a47230bb3cb902f7223124</a></p> <p>Found in base branch: <b>master</b></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary> <p> The WLS Security component in Oracle WebLogic Server 10.3.6.0, 12.1.2.0, 12.1.3.0, and 12.2.1.0 allows remote attackers to execute arbitrary commands via a crafted serialized Java object in T3 protocol traffic to TCP port 7001, related to oracle_common/modules/com.bea.core.apache.commons.collections.jar. NOTE: the scope of this CVE is limited to the WebLogic Server product. 
<p>Publish Date: 2015-11-18 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2015-4852>CVE-2015-4852</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 2 Score Details (<b>7.5</b>)</summary> <p> Base Score Metrics not available</p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://www.openwall.com/lists/oss-security/2015/11/17/19">https://www.openwall.com/lists/oss-security/2015/11/17/19</a></p> <p>Release Date: 2015-11-18</p> <p>Fix Resolution: commons-collections:commons-collections:3.2.2</p> </p> </details> <p></p> *** Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
True
CVE-2015-4852 (High) detected in commons-collections-3.2.1.jar - ## CVE-2015-4852 - High Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>commons-collections-3.2.1.jar</b></p></summary> <p>Types that extend and augment the Java Collections Framework.</p> <p>Path to dependency file: tensorflow-2.2.3/tensorflow/java/maven/tensorflow-hadoop/pom.xml</p> <p>Path to vulnerable library: /home/wss-scanner/.m2/repository/commons-collections/commons-collections/3.2.1/commons-collections-3.2.1.jar,/home/wss-scanner/.m2/repository/commons-collections/commons-collections/3.2.1/commons-collections-3.2.1.jar</p> <p> Dependency Hierarchy: - hadoop-common-2.6.0.jar (Root Library) - :x: **commons-collections-3.2.1.jar** (Vulnerable Library) <p>Found in HEAD commit: <a href="https://github.com/ChoeMinji/tensorflow-2.2.3/commit/1f65fd168afc52c040a47230bb3cb902f7223124">1f65fd168afc52c040a47230bb3cb902f7223124</a></p> <p>Found in base branch: <b>master</b></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary> <p> The WLS Security component in Oracle WebLogic Server 10.3.6.0, 12.1.2.0, 12.1.3.0, and 12.2.1.0 allows remote attackers to execute arbitrary commands via a crafted serialized Java object in T3 protocol traffic to TCP port 7001, related to oracle_common/modules/com.bea.core.apache.commons.collections.jar. NOTE: the scope of this CVE is limited to the WebLogic Server product. 
<p>Publish Date: 2015-11-18 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2015-4852>CVE-2015-4852</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 2 Score Details (<b>7.5</b>)</summary> <p> Base Score Metrics not available</p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://www.openwall.com/lists/oss-security/2015/11/17/19">https://www.openwall.com/lists/oss-security/2015/11/17/19</a></p> <p>Release Date: 2015-11-18</p> <p>Fix Resolution: commons-collections:commons-collections:3.2.2</p> </p> </details> <p></p> *** Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
non_process
cve high detected in commons collections jar cve high severity vulnerability vulnerable library commons collections jar types that extend and augment the java collections framework path to dependency file tensorflow tensorflow java maven tensorflow hadoop pom xml path to vulnerable library home wss scanner repository commons collections commons collections commons collections jar home wss scanner repository commons collections commons collections commons collections jar dependency hierarchy hadoop common jar root library x commons collections jar vulnerable library found in head commit a href found in base branch master vulnerability details the wls security component in oracle weblogic server and allows remote attackers to execute arbitrary commands via a crafted serialized java object in protocol traffic to tcp port related to oracle common modules com bea core apache commons collections jar note the scope of this cve is limited to the weblogic server product publish date url a href cvss score details base score metrics not available suggested fix type upgrade version origin a href release date fix resolution commons collections commons collections step up your open source security game with whitesource
0
34,197
4,892,968,913
IssuesEvent
2016-11-18 21:28:04
dotnet/corefx
https://api.github.com/repos/dotnet/corefx
closed
Failure in AppDomainTests.GetAssemblies due to FileNotFoundException
area-System.Runtime test-run-core
This failed in CI, on CentOS https://ci.dot.net/job/dotnet_corefx/job/master/job/centos7.1_debug_prtest/280/consoleText ``` System.Tests.AppDomainTests.GetAssemblies [FAIL] System.IO.FileNotFoundException : Could not load file or assembly 'TestAppOutsideOfTPA, Version=4.2.0.0, Culture=neutral, PublicKeyToken=b03f5f7f11d50a3a'. The system cannot find the file specified. Stack Trace: at System.Reflection.RuntimeAssembly._nLoad(AssemblyName fileName, String codeBase, Evidence assemblySecurity, RuntimeAssembly locationHint, StackCrawlMark& stackMark, IntPtr pPrivHostBinder, Boolean throwOnFileNotFound, Boolean forIntrospection, Boolean suppressSecurityChecks, IntPtr ptrLoadContextBinder) at System.Reflection.RuntimeAssembly.InternalLoadAssemblyName(AssemblyName assemblyRef, Evidence assemblySecurity, RuntimeAssembly reqAssembly, StackCrawlMark& stackMark, IntPtr pPrivHostBinder, Boolean throwOnFileNotFound, Boolean forIntrospection, Boolean suppressSecurityChecks, IntPtr ptrLoadContextBinder) at System.Reflection.RuntimeAssembly.InternalLoad(String assemblyString, Evidence assemblySecurity, StackCrawlMark& stackMark, IntPtr pPrivHostBinder, Boolean forIntrospection) at System.Reflection.RuntimeAssembly.InternalLoad(String assemblyString, Evidence assemblySecurity, StackCrawlMark& stackMark, Boolean forIntrospection) at System.Reflection.Assembly.Load(String assemblyString) at System.AppDomain.Load(String assemblyString) /mnt/resource/j/workspace/dotnet_corefx/master/centos7.1_debug_prtest/src/System.Runtime.Extensions/tests/System/AppDomainTests.cs(405,0): at System.Tests.AppDomainTests.GetAssemblies() ```
1.0
Failure in AppDomainTests.GetAssemblies due to FileNotFoundException - This failed in CI, on CentOS https://ci.dot.net/job/dotnet_corefx/job/master/job/centos7.1_debug_prtest/280/consoleText ``` System.Tests.AppDomainTests.GetAssemblies [FAIL] System.IO.FileNotFoundException : Could not load file or assembly 'TestAppOutsideOfTPA, Version=4.2.0.0, Culture=neutral, PublicKeyToken=b03f5f7f11d50a3a'. The system cannot find the file specified. Stack Trace: at System.Reflection.RuntimeAssembly._nLoad(AssemblyName fileName, String codeBase, Evidence assemblySecurity, RuntimeAssembly locationHint, StackCrawlMark& stackMark, IntPtr pPrivHostBinder, Boolean throwOnFileNotFound, Boolean forIntrospection, Boolean suppressSecurityChecks, IntPtr ptrLoadContextBinder) at System.Reflection.RuntimeAssembly.InternalLoadAssemblyName(AssemblyName assemblyRef, Evidence assemblySecurity, RuntimeAssembly reqAssembly, StackCrawlMark& stackMark, IntPtr pPrivHostBinder, Boolean throwOnFileNotFound, Boolean forIntrospection, Boolean suppressSecurityChecks, IntPtr ptrLoadContextBinder) at System.Reflection.RuntimeAssembly.InternalLoad(String assemblyString, Evidence assemblySecurity, StackCrawlMark& stackMark, IntPtr pPrivHostBinder, Boolean forIntrospection) at System.Reflection.RuntimeAssembly.InternalLoad(String assemblyString, Evidence assemblySecurity, StackCrawlMark& stackMark, Boolean forIntrospection) at System.Reflection.Assembly.Load(String assemblyString) at System.AppDomain.Load(String assemblyString) /mnt/resource/j/workspace/dotnet_corefx/master/centos7.1_debug_prtest/src/System.Runtime.Extensions/tests/System/AppDomainTests.cs(405,0): at System.Tests.AppDomainTests.GetAssemblies() ```
non_process
failure in appdomaintests getassemblies due to filenotfoundexception this failed in ci on centos system tests appdomaintests getassemblies system io filenotfoundexception could not load file or assembly testappoutsideoftpa version culture neutral publickeytoken the system cannot find the file specified stack trace at system reflection runtimeassembly nload assemblyname filename string codebase evidence assemblysecurity runtimeassembly locationhint stackcrawlmark stackmark intptr pprivhostbinder boolean throwonfilenotfound boolean forintrospection boolean suppresssecuritychecks intptr ptrloadcontextbinder at system reflection runtimeassembly internalloadassemblyname assemblyname assemblyref evidence assemblysecurity runtimeassembly reqassembly stackcrawlmark stackmark intptr pprivhostbinder boolean throwonfilenotfound boolean forintrospection boolean suppresssecuritychecks intptr ptrloadcontextbinder at system reflection runtimeassembly internalload string assemblystring evidence assemblysecurity stackcrawlmark stackmark intptr pprivhostbinder boolean forintrospection at system reflection runtimeassembly internalload string assemblystring evidence assemblysecurity stackcrawlmark stackmark boolean forintrospection at system reflection assembly load string assemblystring at system appdomain load string assemblystring mnt resource j workspace dotnet corefx master debug prtest src system runtime extensions tests system appdomaintests cs at system tests appdomaintests getassemblies
0
423,035
12,289,772,904
IssuesEvent
2020-05-09 23:24:24
jwcorle/425_FinalProject
https://api.github.com/repos/jwcorle/425_FinalProject
opened
More/less damage to limbs
enhancement low priority
Goal: When a user fires a bullet and it impacts the different body parts (legs, arms, chest, head) to be able to keep track of these and deal the appropriate amount of damage. Possible Solutions: Start with just the head, splitting the existing capsule collider into an additional spherical collider for the head and integrate it with the `EnemyController` and `WeaponController` (deal more damage when `hit.gameObject.CompareTag("head")` or something)
1.0
More/less damage to limbs - Goal: When a user fires a bullet and it impacts the different body parts (legs, arms, chest, head) to be able to keep track of these and deal the appropriate amount of damage. Possible Solutions: Start with just the head, splitting the existing capsule collider into an additional spherical collider for the head and integrate it with the `EnemyController` and `WeaponController` (deal more damage when `hit.gameObject.CompareTag("head")` or something)
non_process
more less damage to limbs goal when a user fires a bullet and it impacts the different body parts legs arms chest head to be able to keep track of these and deal the appropriate amount of damage possible solutions start with just the head splitting the existing capsule collider into an additional spherical collider for the head and integrate it with the enemycontroller and weaponcontroller deal more damage when hit gameobject comparetag head or something
0
130,837
27,773,592,077
IssuesEvent
2023-03-16 15:50:17
modin-project/modin
https://api.github.com/repos/modin-project/modin
closed
CLN: FutureWarning: the `mangle_dupe_cols` keyword is deprecated for `read_csv`
Code Quality 💯
This warning appears for `hdk` engine.
1.0
CLN: FutureWarning: the `mangle_dupe_cols` keyword is deprecated for `read_csv` - This warning appears for `hdk` engine.
non_process
cln futurewarning the mangle dupe cols keyword is deprecated for read csv this warning appears for hdk engine
0
530,710
15,436,101,012
IssuesEvent
2021-03-07 11:42:52
AY2021S2-CS2103T-W13-2/tp
https://api.github.com/repos/AY2021S2-CS2103T-W13-2/tp
closed
Add reader
priority.High type.Story
As an individual operating a private book loaning service, I can add a new reader to the system, so that I can keep track of the readers.
1.0
Add reader - As an individual operating a private book loaning service, I can add a new reader to the system, so that I can keep track of the readers.
non_process
add reader as an individual operating a private book loaning service i can add a new reader to the system so that i can keep track of the readers
0
168,768
14,172,566,656
IssuesEvent
2020-11-12 17:06:51
dso-toolkit/dso-toolkit
https://api.github.com/repos/dso-toolkit/dso-toolkit
closed
Issue #835 must be listed as BREAKING in the CHANGELOG
status:done type:documentation
Even if #866 is fixed, the fact remains that you still have to do something.
1.0
Issue #835 must be listed as BREAKING in the CHANGELOG - Even if #866 is fixed, the fact remains that you still have to do something.
non_process
issue must be listed as breaking in the changelog even if is fixed the fact remains that you still have to do something
0